[Binary tar archive — contents not recoverable as text.]

Archive members:
- var/home/core/zuul-output/ (directory)
- var/home/core/zuul-output/logs/ (directory)
- var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)
Ib #,-XrKb:g19bնNϙmqT*m%TI}k?[o6ewY/y6; ?;<|_} M4}w3,j\ڌ\i{k[fe/盾{/m 3Ԉ0,wϙ[8g̞1Oǘuϡ |uybLZ;扒SI+r$JwMU"57li_1L6xj QRi0;6M_yf5gOyHEVt0󆅙^w;օoŭ>TT$=,3cC[+Q~kUE>DE˭]?:(s`Lm2qb3Y0{uZ =-r&bKk.488了f顡L$QF$x 5U8HY ދݕUa OƬue"HxN`2}bVOw[!Ҷh_oS.KX.5 N1JnՊ́,,qN1m%*&y߻Lֽibb_[S?w~no:.^f7'QdNa?Oi KFp4|f |MoAA#IuH'+Q̮fy%i8jQPbNd)W*QjUϪ1n1,9-ΰt}>NQqxo4=y_7:%ῦSMkigvǴBÏcՇ~| _x+qB$,)9WM܃M`+>C; j M͇+ZۚO=#UW{})Uq[e-e |5oc,>8]R6ў܋дOZMଋvV>+x[br3^ȑ=c*$.'C g +I*E=ˌs&((ҍ1I,bAV mP%eZX`N1ѭ s״u, ~U-?#5_rh)y.-wi vvu.4pݡNC򧪶&O%L]*a=hgdNW/]{t ɀB;I~z|P)e%!Xm# 4o4gЧTRvyԟ6*ZU$3ˊvːpd70A~^dh^5W(;/ex' >d똗diBagI%Y9u\pEr2c*;CoHӴM/k!\AW "܂o;ibuK!x<=b]>`PUps" 44Rg/y#;6Fo*1)kAqiqO5.(6>k=44VXo LDGiOSR;TEvs,ffXj~Dz,6YIꃾ|>4@%4f2f BYQ`\pR ZqiGפ+zlN'9}mr .d(lN#B *#9t6s8hW99,=[ IlNjkN|"͒*럜:]}U i" ":rLmN܍ ~ y/H\vB&$ǠtZ vQ+&f(O_!DeCp[V"WT$e-P"^HU^zM^iF\&$DMe\98U`16LIDY).^~= HM"+P*R4bK 9gςZ;@[Dļ26 K%0)6%RdVXNi#*DL>,W{F&r,d@fPXC!@S#Jꦎ4"o׾kVaEƠ4@KI/AYf;zyg]`?=FdڴN N暴gGDOmQB+~ss^cx:Bgte@hJ^us+MwhkoC;'u!-u"g%l!aIR$夁z_Ӱ7v*|06AwDV wCU~n~~ W(s-ewDZP^Ӎ}?:?][Z%:}o6+w77a>4LI3N 7TPWX(/bДm d5 uku E1?;rC@E2A -ǐcsLZee*5KVVHPYV*l˒,R`wZ"CцAt6K} dʭ|>+7w\Q1}L>q@ xܾ vifOui@H1{J mw=:}VtvoIڸ{.ͩ/@h}ץOv|@Ӡ [m nnwO7th{?f!le3[w7zl=/\J|mhs]Ь R']Ke#kOx4?oCly?}ݐ:l*LaK[?nHcǩq:8R *: q̄Fʬ'kՊ+݉PotX/~ψӼAYU@s U ۹_[wt ~4ݥܿ[G0q<;<ά GݼY=oj N_j5i$~P=ؠ*^h+)fDA |(3 tĵa0If\SdIe Y)&/cP`:a+ Mg3g.Vܳ洽[kܳFtuk55vgYU*oN WJf`E`&$CVF$!Uɤ2*3@*59#M \IXҨ] NL,IQ,`+3 ڒtH<,2) jzwpKYxjr^,+JNoniW_qV'vO4~0}?#Kfs#MRp6']ڼ A[eyLmw, EQ0A(bA- #\$Dú,K)34wela4KHjRWڼ!cKR^ qI .B2$bbE5a:tUdHn2װQ06?kTc]+za9cHR]ֆC4%]&1RKW@6 i}D@YVtqZiga;`y%Y霕j:+Jp.DJn ^4ǟzkJyЋޒ* ԔX9*[b(!a)em` VbUD=_k[H ۋ!:.>wUQ/~yh;i֋7}z ?x'.?wBr)G!Ԗ?ۓl2)ᛤmQv| NwKF@01JFgDTy)-Za"ƇPβfG4M:3 KXhᕤIKX0.lʾd7\J)JKR_!ER)L_n4vNq+TN8S%&2.bNE\ ͧ)-sKo!F J=>XD\3BDNʠ*7F+BLfT^jIK9Y!}z7J[WK%)jl Qw⇳83D }S3C Rb ^}g3C1B'm "jPXFJgHt I38&aMP8'gIGm N),'Gsi U-# UJ+M9roU:::Liw٥}-egCJGb<9 /vGevzZ?WWY ּ5rWw5=0>(}Fz2cЧAxyC"ίȀ9=\gbTmAL&ʌdXGhk1<7;#Aߵw7 Ֆf6o}_"vIk8r%ϑadY7ߜ%PKH9 ӀfgY3),C.qT9* Y@g%&+AVsٞ6?10C۫]KMf-l= trf 
75|hC5;L:|g̎1;<cXc#3œcgZHzWUBdUmR/Rasm,mKȸM]M'Iʭ;I_;3͐ߛ(Wt|fTx})4cq{KօkVNgp7?4,^g6 <&,]GT{k ΰM؍ysTxJ&MŲq^.^L;e҂Γ(N:/jVnCS-r΁PԸq8rxGcTQhM^9Og̩}JB2GW^P-7ZK J:_8+ω[`Ohݹdx.FR_6:z+Bz݅ڟ(^`Oя]H]D+Ǚ$%LhDBs=h ծpHCdP& j(OK@ OJ$\[2qpE6SsR~=Xj̾ǫC"! zE&]02 '+-&h""n%ua!-bg`Y%tx 7ZU>wݍר#xʷyvXpʋ4ERV1\`uP.V|UVܩR#``zJ3m#5k:M4$q/@MBs}'΋Omܚef -h.AFLDDH\HxL`NẀ`\JAE.VCY *Κ\d"gCc1.ED DphI Vtv9_3 }rNnfp~C{/o6C>6-3tz0Vhf/b:zJ2Rǫ3FLg+Z()<5*nqېa^;(c0Wq+zYXVY_ N(Dg%D!í3Hb1e[< ".qLg5^ED P Mva33ݬH̀Cqߔ~3~@/_T/k*>.0z޵|_cgVEg P3ў |5Fރ Ȏ) o;rq!Ž /7q`췖$Z@J8jaEg#1u'OWG\T)('y 1,зKw5>l-DcJY>O>-|27{r Nv. o8 6#X?M0uAt0+2mZ> UZv8x-iM2yMv*e棎nFHX vsػM,6.N;;ϳNQoT"M{Y8k$?8ӧS~㷟|//ӗ_Os)p)UXO 6fCs ; -uɻn|q'6Jr˸؇Q56f~rksǩgy-ꝟ5{`1oNWp7w{|E=ΊyZ j3ˑv|Po@%'Sމ{InN틍k7P I"&-IsB 1@t:楈^;VRBrD@sbWs0yU>?0gAOJ\ј!K҈>IOWks&nquSdS2g=3g1U6sS)l}Rr9ti NBTjKy)pE{0D0JP&sYReؔ9R^WP@8Rg-ךROmZ Θ ߮coR~jb/Q@oq؛` F qk)]Um~l=:6lA4y e~^.>ɾǼѡkMۖ9 v)ise!!-CZd9Iqe.a0 VP‚G`GE\XI<XX' t#EwV$UH$!YRHV!R0*nDsL$G h©$q.@N%eL*$ <#:j>1gJ*SdŦ *HωzKokquedߊ2;W9fhy'S ;!9P/\*m㈆(='ХRkJ@M 4IJ*KL (a$h8`e@Ւ~'I3*Ps\%=[G1!SeAAL%9U3RW%=PycH7w9otfzֵ]v>γ=/L9it{O!<.4;LnP+5Űr\Isvh/Q!W_uj6ҡN`uilsFA e6aj2A WḂ߅0!UIVVWG-w;Bɾ5?K}vQ8ln+d09Kʸd7& ?c?C,vF7wv:D5: \ "b:KIWX'J|d H"0Qy[:*)Q$]y#8ŸZNѾthՖ8ZuRtY0eȞﳜ^7YT*oO #2g8Q( A9# 9$%FE%*#ctH2=j'*$c)U1H*TM(JKb.7b,\'/OizqG/qrڠAUw,C 0K X/)\Nh#Q y4983ec)5Oː?1βRy b& ZBMb`"|L+]:tKlw; HbԱԦ6v`7l#vO=3*ObDFW&{Ȳ@#鳢0 wZP@ != eM9IٖXYkт IQŪMg%L'P#XR.MVcÇ;n9I\Lߟ{hnA7ܢ.}1H@FWg>]3e_n_|}^"yC 5t_ſ(x__8Ym>Z0m3fQ DtӏolS=?;81bjqj^z%$֗U~x6. 
WGm\y]޳F9^ڮ/g>Rq) sN h9XaRo׍+ޘ|Ҙ` V0nnVZ{Bz=|ִxN.AA^]ЛKWǂ#;o׎y6.\ŻgON7{_$m8.ُs)_..siꗃqsVn뜓kwyǽMS_RQq9e7<[x((<(Hs9ȱz%gj3K.gZL %~ 9%yMw# qNˑ ae0v6O,ƛttt!isxTA+HggUm㏧i DxR!7Κ}y:;> ɵ6w/G{:4Y}spw߷yuZ9oggmWbM!C0S浂ü_V秇]R?9N?q~~xÌ9wiwgvfΒ3G4+J"mESb VP)9m^zf xa}R ^Kk ޻m]dћ[(?FxM'_v'?Gkθ{ Mch/spx8ZH_3{eZAWOhI:\N?dFﵓFӮ.oɖj[Iǩ}voۃ/G_<ƔcIoNZe^lZH/F6g0nɪ8i-Q!Ű.cnfGt<|8yy.[o5]̔ ,}zU9d(P`o[jv9*7qw!iWt=w=zGn*_V)'o_mAਸ਼_/̥LO(uw5R ={M^ؚdI%(ҕP&׆Qj"4j^ u*֪Tra>\MtVMݫRyctlƢ8ОuV Pcǂ:V,ߎI]efj͉!fZ v o]l~#GTS;sX;+j:7f()n"7M)cĜkY{t 1&34sq}cs1׼> B%ލמ`=^lJ, o2ޡL2:玡BQi=&xB. Y6ރ%v;qQ~B=BP: & aȠ>ΎSi7hSآ8cgDɘ3ş> Ԝ&1NTRY#ΩTRmm ; ף(]'p_c1Kѻ)4|',"8:AOkc0m€Ҕ%yNc}Oi;k^,HU.C2+הOtOV)YDw"R4VAfYt7H!QS( ٥fݑ#"L#mTȗBi픱EBXYS6V1ؽؠt@<0O4u/uh;g۴l(*gu%Jr _e]]/ 2hEgK'ǚژ[]J2q-Ygs#)sx19'- Mw5TMX&E  ʄW's$4 ( r HM餻P+*өPm5;IH&GI;-V(!5Nd=ಮ ++q2FN[( qb"Y' sG !.o0P#_J Q'T jXƯrPfdcBQ AP{S e*3RPHqHqƢ+xgg8kVcWݳ.׭Tƶqf`yVoAB)Srj]&%"qPBY3 Y"1Ш۽B6ф=r+ZQCk>84iZȠ s .`X3RUNJYg 7*ƄU(6?IkB /rB060æ)t|06^n]-m/ mzZIYi%@"TYǪd 6Rt%Ch2BCT΁dx jwX-DU3rBqCoվgb/o;lE^1H"N;RkB $\sa~Fʟu7/V1Z8P 1,*1"1QB;L z!Ai,:W1i)0Q`3j hNhc;)YѴRO 5֬Y 6jP%h3u"1lgj)օ'k=Fw *Mf,>ƃ7TcUlѫQi#Z[U>:ƥ &FX&[QZpZk䋞P!V&3 Mtd*AيĨq %׆.I2b~"PqX!gSMXĪi\0 J.ȎYyY 59H˂"Ɉ!6M.Xv%a$di_PsU~:OW,<<"B.8 R VQ_]ͫٵ#JP@3_F%u[zĄ:W['۾{w @ϵ,a`ԗm\#@%.nX<;/H.@0@0ΐ@gOCٌ@/($@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@/c9eH ?vI EB (:@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@BX(t~Happ;C s# k[; 44r'K$QV@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B\` EvNa;Cr>{V I VB $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B rHOZwGchz{}~7Z=_  v \9p F3 ޸g.J .i%z#Al蛃2ݼ_yqՁVǫW{l\5Љ˨|^uLNiHЂcn}2n wz3[35{0: )94iZϵ|vй0>5 y3s" ڀmP P&%*ts360y7~zrP_JٗGۭyac3.k Sa!^ʻ(l݊Wy&p%eI^.&b ʀfyȀ}*y 阬6$wMnD5Tc9uͼN,YI>6BX 'r,GE5XQlg4ۑ_m[-FzS _z=ݺ P{;\_[*}^9|+sED2+L΁mY_ υs}nTЪS+cqn \Q/|=xW623鞭lļ3t(^ C,q'5jl Rnm{¼k]P!ɴs~/7b0$ mn0!f %iK#VrI=֛~Sse,s{Ͻ<} b 2BkՂLD'5sFk75&cf4YL(qr'o:抭3:*Ĵ)K7V*ehRe$AEF0@h"MN MMVu7HFu79ZϙX_绯DEz2mx8z4ۢGZL/6Ks|+NG A zJmw*É.&f2rp6rsZdmSS!Z'R! 
T@iblОK<1GBRJA5NHlUc:2;@r g?ӫ0\dWoV]s̯.!熱ʳ06mON `ke#Ul<34wy;15H輶BASB:rWOবvueJ=9+#b$ec+ouu9SEqSpr#fjːL)qfhǨ%Hh;G܌h;9;^ n%wDpW㛉 ]FqᇓݮqQݧGۻN|4xZVow "ČN Xf,qD)˂0<)# ͤ y;vּxˏJ?~?KNYuoB$< Q)A"'Ҏ[gEҜMo EoltniF2:Yi0-HG4PB;)sp􌔗j&WwSy $FqL- P (`\޶K Q p(.J!M*goe~LTָdը\j0onoJ5G =?oU&j]e67{y5x SzxaڳRrePup+E"l_)y 2E ?Eή0ug•ݝUSWgao%P0J0b/+XRl.\wnC6?:Zc~ }tnpmj!R__fjЫW xQADsyr:s;!g?/"p_e r/[-U4FG+Id %OArY058Sa;bRM*eol{vkL5|9!\"<hn=/u*|&U"݅iJu@oPTR ӛGnQ=АmOfdfxp[LacvNuQt]7亴WK_R-%FȝXxyZ¾gIgTB J-5gEcbdLVH ,9k7['w^;({c10P}wغ2ކA.B9Bt@~ ߆E |Aw*;M}]aϮ@wǒp u <<yywF> 6_K`yo[|-+9=:s i͓ iUf_dX^r+(@,dd*PIɳLk9RIX9|XP@^Bk Q*m4*(aG4QҠbSI(b fRBaF)mxQμ7[WTy 3Nܗh/Jf`}},@_VM&t7JDuo<,PzΤ8X"sS)O5%DLMe"0=KDrD )/5(82fD@ŊJ&TVuHzoM/O 9`[/20b b*Fꕂknt[eQ(Tu28j{xK˓D0 "`-28MBH@;Aa4K&oi AHRVa S띱Vcƀ qDk429;p6(6^尰LRR[(|E&:낤[>n-]]ujK>ۙȍU)O[W\6wzT0^Zкj]47wYV[:ْbh7f!,eݻݴz^4y[>gZn)itwseϏ˸[pâL-9L{+{p1ߚϚQ\n6zrSՖߍtS|/inn~$ݜq& *nsB^LqD./]S/nT/n ÜΞؔh|9̨ƳTi #'F}cQF z Jp /{ a9 _)|Jpulrn?EbY{Qc @B:@9t:,_^}%OZy/Ɣ*8ˣ%Q !1~!"HB"0$ XSD|0arroDeFv 'A0o?\}ΕL+/w^xV=cs)X$1"Q$L1/-0pZI @ dB}!| .a>j 93Tnɘq3J9,̶3˱S9d/,QܔOoҵz;^UwkL_AaF= D ß@y8.A\iA$  K1s*[`gsZo{e1eA^_ ȇ!X`T9F5 be}0z)#"b1h#2&"-':T3ۆlujw)@).mFikWM%s۾md9YM7,YydcOP F!k/%%X .XZYh6tw 抜)ߖ]GgOZb}YfRz6gx[L?_/+q#1fǘc%ͺ:=0cuLL2D-IG::Rs]O 4LN2.taAQEwRMRnIv\b{*/n`EgbaJ؅{&>>毸WPT'Ţޡٰ\bt5=~f LR|aVtʳΠߌyWJ9&j?sbq\Xu DDtҡzgh7[lP^*1o&I+vNw W!8i+r1rER+]:3T+%º&xLf .67Wm)!mo\~K~FzYj$KBzݹ՞r,:'n:KԱM'jm{t.nM3ΤG$R4G#[!Q)T'^ӹG3R.z+uއ94Y! Na=E7z|Sz44C^`Te R|ϛ+)O"Ӊ\EL'jlNT*щW( 쨔` s|4*++V`vqWP\IɳU\T!rE"*Q9:qzĕvD>eUhR&jEw+-wW@&hU"tL*QZ/du7/G\ =?AD+~o< 6}W]+д)_vԡʓϏp"n98s;Vygpt(^ C,q'5jl Rfl;¼+ )$PϦ0b^0X_X`.Txv[7 qpFq_|i}ohX КBbag`hMFuwUi@ҀlsCђ_CL[ΜycR4(# B, ύ6Na88E8 MV[u#Sq&V|[Rf|^i$G~ODxtTizN{'fP!8NtAr) ڢR*)ss,hr҉A;R^%{ <3{9llR-Ґ%E3Ɲ1z}#1;^2ZbJ`ŗD:@VcT8ֈ #. Z H/1D^!;T^]]aF}8玲ѮGنԺ9SEqSpr#fjːLdhǨ%HMu+¥& S&a`4cZR(k R5,3FΆ˳}"Mv^3Z%JV\tgU<-8iˢ_ .7&w lƒgHJ .`&Œ+L(qB3Dzs{(ȵffWF &MDIFɜ0(QiK`} 2?ܳ7"H&W.] 
V$􇀷:9K Swb2\I9xy?V`WE-65igC'W3r[}3~M;( huŗ62>VYN/]TBjQ=wWuz2^UPi|COF`i&Or6 p|0j/TM'#Lu|kyƇy`͈48蟝O֖37I>- 1FRq$i8 Gq ևᨒOi)> ]ޏٿWLc8*AGNiԦ ,' ըeSIs>]MzKbhrtZ'jTcY8W>s_+&뇿pV` 'M$P3 ډ9؞C-ֹCS6Z|q'nsMaC@['QC_y7U SsrU=? l~9>.*Le~5ܥD"_EL9#}MtM]1"خ%nmxȋPxD :c YYc<8e9hI#("M٭ SWTEDhyOJ?5aฉJx]h FpQhxJ>')9xz0bK,:mO<'y~mֻ.7ʍKKNK0KXYZ p}uwIg&2L J-5gEcbd%^s,9nٷ⟓/C(v;ZlJa^iSQ Y@SKBs%%cmjޘ,` Lc(crp?nt3𕌷(`޷0썡]: C@Nzo"WR|7=ɳcx1}6 Dcvn;K~3.4哐W^w`p"5&mMv!öCZBZ@Hk۫5N;,5 y%7mBHfE4#NVN(ag^̛=Wy 3^>_ӅD49aBIdrHdHd PG;EWa LC(]>.iL&bA%:H.ùD!G4 Z|ځc)cFt(0  ],H4Eʣxk]2gH>" ðXp n SpZ E*NZGm/_)Tl=$axd$nhb4F4)$ 8X6yLG eZ3E佖&Z)PdDdЇs"_D]<?4)zbUG5DZ|g@ݕpY$]]@p}̮>7zt5Uk\\IZU{ŕI@huש U 5[uŤa9NgmիYu- hYԻunW-NrAs-W㾹駫ɞ f z'_]q}d~f"b5ӟ64m>[՚PU$:@L7{CUeߤzNX*䪣L ykr8Y7v^(2'G@͝N-Z`P>RLzKuEڂ%bdF/cLʍYY^0flr`-> oJ|kW< Jsy<1b{"s_@M6K:N]:]rG7o"t!\w L1vN}~|v;i%ޭT|`dQ,gиj¿'8@*sVu^7app h|m0$¹y\z?p Qw[^$ 'E64h7iY>T}^є~Yscjtq|śz/M]H vrdIXncQۋT!k&MI%yj["򦔂2)e1- &X5 is,Ezg:;d+e)Zy-yLH= ,;j|ɬCQ{0rHRjㅑ|?4vCcib ̞y6qֈBj \./݁V/l@^`("CV;&G*ęsR5U2U]%tPRHmFU1N=cI \ʸY:'U 4dEBﶺ:[Lq$\@KeKR2K@^5S*!dGs5 DgRnlWZ1Ic$,"PI{Rj% 59B+J6v@6ue~^ta[:qگzAW+c=]Qf/A:|*49_7%GrXTbҥVXTY{XG_7.O!' IUp;JO.8P=l:Fy4‚Sѷ%{qϠZ\糺sҽ{v< y.3/?*uMMW[coޞD=n|31a2(%Zv"_jY"M-{L֥Ux][VY[w&x@oGVwTR}Q*8ŸO .*{=I@U` [eUukB)0pϼXIJo־Up`jpj0V+u&Uoc\(8F!څY*x(M[ _CuǮD(xC]]W“C// Nf/+_NY+ ѪW0„C_OoQyBD3A|q_aPW'Ż::L-P6Oi(4cĠւwx O/0}.2D?U&O).<ȔEЉ/4#=~`t$\gɕ ~ uJ`RGTRJal0>imw1Y.x 6L Q))[EKo2(O4J5^{Yn)c(-[kg[.1+1Y^E'WzM1E-bP񳊯+7ǰB\= X zi+&mͦ +Uj3-O`7id)|~ yV&( H r N~ ßUj2_|LJy{dOޖYMV̉[kSAgPNdN6ˊn'dm8C5T(צV7zqtD\:{ڮ ,A_eI˪<,k m텟i!>$?7rݦ}w+'rJmUL,0Z?`ŜL<˕9G5J+ !wi/KbG2 }+#Ulޡba`c 1\..s{t@ɕeNF ͗aL*a`6Sd eJ@@sy6 s̩Bi֭b*lƥIl mT0{/nQXjg"מ@V:ozٝIY՛gv0,;! 
rT3AWsXrJl ʦSw߮mrYi9V0˅cbKQ[N?+mV]݊ dH' \ĖFXZ2[.1-Qy}7p's$v(ҟ]/_ 5.~Wكfk-@'/yyU]'osaR/1")V2""&ZH0<)c"ҏޜ(xu\^P ]G-$}(鈗.fI{RiEWs5),k,Yydc>0E4^4`JnXb΂wDðq$ݤ 5Y{Nk3io՚ ,=nAdZp$cX,-,iTvA{砵ñˊ92[l s9YNlXQ\ux{IاzI&tCyh:%p{3&BWD?^n+ݽz`h>xna;q9zvK;NJ5ovQI&[ S&znzݴ8]Vɝ:dU 3HT(hͳQዯG8z$&uiyjOvvC(ꇟ5g>V9/F/ lQH[oA g͇M|vxm,z7U[mh&wxiJqLpVh*ٱH0>tJR +4 fH \%q:J2rp-zp)rSkŧz *M͔[9u  8/gh*!֗zMCs/1.*09# .;as!9“7ox7Ë˪Rb`~/n8M ' J@u~?htS5,{w٤H9'LVՙ:w]y] ]\__40?S,Yhf`Q1kZxdJZ +%iG!$)Զpup%zqt}գ\=N\J W '8(-\mrĔ儔Fp59sTGCJ9Ła`ws]ZgʽF ނ& 3Bs 3lS{~>rGp=1N^82>5Uu}Kc3fXHc84i`>4ȟ;"^c63W\Du ŶIJ!H輶?B; l$N4[Tծj'ep}A<ƶy:*Œ 31 V[`:x>X@F;F-A:orb$\.%6!`pe4K5X1,fٿ> 4p`hvo.J(tdBXR-jlE=2x|86Zi7û9v=56w_>2zcӣ"r-ddk(bC{-8xlB48e-K!?|}',[JnO%o5Rq#fN@W+rå#]-"=qJ/>¨itgM֣yޥ_[Wݿ~⍏d4_do'>t횓h4=|U#js'8bk$#)y]Èa4a։bڄӳV>],Zl:Ynvin:GlEvڵV;꬜$Gogmƶ򨈍c~GgNS[:5Dw6?)H}_ן>/_~4Οh.1R8ꚂtO¿45x ??bhWkho9jz{;LWk>`AFj֘ǕmHO~a:RfY=u+}Zs~t(-B&6J!]YInw#~MVrmɶ.p1$[/Gw*ƺ #IcƳ9g&HSP|h4YIȑ  il{qoR5-#J"LW1R7LD40f CpS$x!yPʡN*MV+M [Ҥ)>llq6vo%llb^%v3#JyՊ_P7\_[τY+<4 4ALX N [\>(?yeʌ !y@*zȇKgu׊b 1IJ 8*!4)NGԮo$AGo&R{ܥϟͿ_gߦLe<JHi'׳ R Ĩ 59(Ir N% RUYEa5 :0ϳws:i;RApn51eHчR4Rr qNc0eyv.EXM{Rt܄_͖O#{t MH3įmw> ^]/]_}Үun⳽'ujRKٜw]z#`n/0nuҺ=mtUBanRˆZ6 jݿmƻ;_]ZPut{R5CgӼ_Wwt(u/yzKi.lo+ ECO|9(7m5m_6/ў6in2I 1سFe] q̆Flo' Fsp"Z؝Q; &;1,ȴAlM >D)VMɝ,'ST+2p =MݧH!= :֗OJ%u\JL3+6S16{a0e&rUFEl9&6" ~&aB?%"*X!PE 2tv3*\F RwEZtr`/۰7XU*N,Ze(QfN2˔N8*$ :0*hI TlN)H @ʒ =K\ ĉ g-&Ík3 ڒtvKzX,FA) \^dܑ f,^Nfݛ?od_Xb k/JFE)K5 <3Ar! W7 dV I0ElJcQ.Z". P9#f-C5;MӚ]:ڼVX[!be Ȋ!HcBҫ8YQYDDD$xi 5|4PsHߣc1hIP3FH1j%Z%r3!XV"HPVzKDXAQeR|ըP(+E9A.n.9tौJ.:}`YRˀ$lKX2 ʪA.>\\so>qpeB p3^ZeRQ5\#w,P4J8UzH>Ji p2,wΆoJ~o7VC+Y; R蘝Mp 7[xtP*ds2rAr"sM%P0`wU~!aʻupGvQ꘼3zQTByI,@9ƍ΁9 8N݇WQÛTNK?;~OK4͝۶{%3]Oӯ>kzm* :HYAmQVƛ 3%|S s]<xG]']h{Fb-^wgz,SU_{ 90ǘ,C;k`f7wchib }W STI=Jֆ{¢O! 
O/=>XT\= |E$ےvt!ľPG;m]*3tIm֙y3Yxyuޗh/] z*¦˪^۾g)C+[%X$;F!huF iFr<@M@M0J+HKCSnSE2D@Oʱ535s2xgFP61 s0ػFv+MDgUlf1ӽ mue%۷:A=\C(+mU2<<@8)`H.)A9DBVh#Nl8Gm I[-JwNZu1-uC~OVv[Y8Nlvt"vW.ofuL8B'%Bg"`; ԥT' O6aD@8JɎ򙵧D.Ћ[U)TD;zCI^x@Dख़4fP#&f"(Ƥ š0Y, \TJ*gХuH9)s%ZN7Sė?? ,D,Mp>>lڵNgR[].)ݷO-K`إeyskZcgwoDzxs4/˃_]Ɨ'GlDaH? q6>X5ZK&-!B۬i e^D%tbXUe ̖=|( A%SYӭٍzk.f_P8ecԖj vKDt*85& {IH)3T,g824FfFć [̥>:͒CqQ7E=​[3kkCR1 @ 9Q1 )t》žaq(Bc<0C<Tiےe?2m<(Z<ن׻EAd|;wW;teK+ $z$lEE$߳ uQ5%OZ18L4UE  j>lf4dהgQF" ]Q"SAf 0#3_$0+kK;/ЍQMlpNXY E\j$P}l` ydFw*;(koڍmZHi5az`h~x++4y(H/(_u|24ĠZ0uB:uBU9X0YW67N<<Psx4z]&x9Dz7?[wvkW:6ׂxgOK^{pq~~szA;S>.h2?^4m4z|(ȿl{Մ ڢ{n'0xnC%?o%&8ֈuYhܽl+ӻ̾1kƗ^g^M?۾dzy6% ffA o,SD_O_|bt]]7$N@( H2#df7ʼaL]Q: t{  |s\ Γ(58 g'y6L>?ٰD@Z#(CzL+:,f/-#;o@ks-ڈ# ,BL LLVɑ8է1X>>ϊd)k1gkRD t HK zqK.8630 Φ6IO~x'Y.-^6Yl,o~zzhD|Qj"*Z}Vkmr_FTxU:*!;cg-EC v!z%K TG+ZAEfg־K;!ED+)JIdw,$`-&mт'6gws@$*C+w1We~{ݲJ|N[ogDΜֆ~&u?g?m2;2ɐS6q@z.,Pj8n { WdQ&HayP4Z. Ƞp"Bc^̜=nS?RlZ˴orl5n3fXS+mಗ`TZeyr_ ϕ{l3dT ؖ09xWHR@V.r/{z\(T9Mh۩2zԋu韼Q Q^#ȖثO^f7y-ċH(IZhAk7 rP(wl䲮Hj҉ sezμ B:$Z4*l ~4g#uT"e|sTd"xhzTtKJ1_ըƣ0~7Fz7aҳ\Ie\ekoij1G] 򝼢2UGMփA i 5{/9Jwc\D޵qs2ȗb`Im"n/ XR*yk/ht=mdFPCy<d䐍G 1ErvcNX }ʐr$yo2E8S[" U _V.p- zq^Xαp\:#(g96G.PYL{yUJ*QOA/kqJ.bV+snU ?jtO,Mǿp8#0*kmKy?V[ufwU7Φ'U`vU,tt2[n}IT Gg*!EkAHSM6$5 aHbY,G8iC*={1':tC$U2y#jmjZE|s$W,q0C~tB-=ڣhv.+E ﳇp7wo)3LJ:<>zw+w&}$ʬWmUͫFܢjeluᷨw&7{e0N"}X,c쟃axӟ2B|\53z0 |0ܟP?RqaP>惘B4*!擸5!I37I)GA䜐y܈!"3P2YRpb׺l{݆+sC; VW@ޢLwʡNY[6&imtVG;;oT(7/]jt_WuKOcfk:^޵W˯㋿} L1I#!ʔA9si_2tdOÊD0R# &)_APB3No8ux[V/f- yi,d  wz60 ym}q$Wt\.e`LxgU"Dʥg*.gFn݀+/$c-PH< T`l)A :  )nS:Q{ؖ(dGJ1FAj#s:$ 1'aVPS1Z8)^vYĕ8x!j6gϫTe`ʱTutPm/Ws.bDAROx;ykףXGC %! &߾1Zl wqR}B)m,R~6rZ(D*3GJFF)6:ܧJL%:Az-sA ~G.Id.rBmP)sJ [7H1iTȜ7eFݚ8`ԗ 9]OqҐ %[c2^ p>Ln>4n֎[n\;4$ViR֫[C&$%seftRI-sRdgwxޝ(_M,H?>؆1HèdRṕ"3H",edXwƊ0V.a`~e96~mǁ/y<:ᝐ$RIP2EM +gAȣ4V4B0C@1@PTs` kPVS6@T':P;e\k/vci&K3R6rwms~[\vs3!.u NuܳFj(nq 3ަ!%3OOBzUOy^1A$$ϔ'TyCUH#!%RQ.1\[DZa;ӊpiVM$;t{bw@6AzB5ŒdhsIT !6BbY""ݬood2d] s MB f`;^jC \QNqN!08-QKLpj0)єH*}! 
wj-5~nY .8Le;%]|2Pmp)φK)ڗiS1'[^ W 6HT/}cg^K`cę,P]p"_p7R7 2v~b̆Tԝoh8 Gy WH X3/ m"5u|bq"#\ (YNC\@`] $% t;d/\ͽ(ųۉSQT|@K``:hkv p#^+J/@HʝJظgBUC6:sozfs2e۷v>Kϧg.[虻̭9fqÄH3jK[/VF(hr*, пL <~o#;RNF=D8D.X*QP! eBIQ30">O=Yo31Q'1*X*"|iq2hbب-d$r>n?tlNkI NE,h1[ԐӥHy>zCXj9xזn"; (#JYb@RIL bazQf|`q(_ ʤ6ngy;w5X|qk1N?g?|°pa}{oMTp^LWWcBj~>*N0z?+n ;ʨ?gPy$cqH潯I(z SnMO 05Y3pdַ.veϲ~:8|-iƚAS5ILc5'mQs/U|4<=wdYgПnhVF6:odSMmUTku|%Fcȏ>b6hٹğ?<*nr;TGo߼{=:8=:8aL#kÅ pߡjVX߼jn-VV~zl ^> IբފaVMB|~Z9 QEi>XFT?\ J1LzTP Axߴ'6E(mgd[S7y<_oZܜl Ů% v1O#u~3ppkD=#9T/].Fk-0e-[K&gdp0l]V w/a؃XS>#z)x. aiFH!HgkʒA )IJ]:] 9k@7&noE^C`!lzR'M A__ǧ=LA"2A.R&xA24B2⊮+:E ,﬊Rȅ3JpQd՚1 ;T ky&3Zx4&Žg;opaIcNQ1 cO,ufYrf;>߫ B2vԁxRHyhʖR W&*(v(M!T;D!c[e 2 *$L@im|B) DD69f#Bfn" f ueg6aTHQϽ|4%QȖ-v`+r?OԹU5& *T0=kq(bᄌL5SVxnX6gf%/))=Lr6T[.E~nk^6 cW mMSELW_HpPOlv%d>ozրL, L,J&KBhwfv0oȨvLW P/ҼggPy&҈-\۟<;Ң'nފ^̵}{^3% ~&pJet1{O `b*:f'hrNR)\j*qac-@.5\\s5Geesvur YX 7MnFwzfƎ(4ŸeI{2F=V3+# (ϩ&3(csób@cEF !mr@3A-{IXC$h0c7g7c8L̶vcP֦ 6YWlеEJ6)fu"8FRFxB)r,vIXȠ +:k"`R,1gY `A,#`5&iҨ_KAblMT8YÌzFZi(o.( *@ex,)h" .Eb ox1`')8YiZ<F$<fBsA0=bn}JPxq\j\gcP^ "y=`Yo(XD,.C (H(`IT$t]ϋOmZǡ|(Cq>5݉eW\BޏVRټc5֩In9^Tyu ^L+4X 2F(^G%jA> k Ss$-r$ 9 |mw=Jq:y-r4sVGAE\Dyə$1pA7d$5*XHI) 4@qzb,RdlMcr25m^M͠%2I&Ҧ %+HV$eˍDs6f'N7 eI t,yÂ< ჆>!L4P8w  WY)8{BHB"NjK0gWc) xfRt0R }NaЮ>vYSYO0sf"CCpũp&++%('1"C-SIZǦJRIov+_=jk8ɠ~:e0=YJ8tP0[% R\@KߢeMrs_gUw.0/10!t!|Sf\dt]`>.GEsކKa.>AnZ)v87z4ᣱr*܃Q୍c!TĜ7~&n,R[~]ޮ?:qGx qp)Wp70"T̞wC!,U}U7??'7-K>u;~?ՅoN,lh4mxw%V&ߪzۑ6ndǻ;J녔C{2ʊ76+Hl4kh,?5 NV; '|U_u|Ҕߍ-eV`D .Ӡ%H,b8LfO,3d RRƹt["RzYPEVBHZBfk˴҄禳7gmjSwn?A˼#gPޟ:; g|7R8?祤v=ɴk /G3SL#@F Czr KޝGfbL_\O7T@+#̧8?ch3!ǡj-"[Be?+ǵ 8+=R]TBtrFC[..]ۯLj8?U!rέ'KnX.?Tʻߕ6zҹweE-*_j>* Aw _4@{Z:Zd>]ANϳ?z5[jq<2ju  *Wh@ S,'ڧ*x5eʥ6y:FL Ҍr0QqLLB8CdhD>~VU=!14ufM[/dK!^b'%@"jz9MШ% %gdL8ًQ4eYw].ǢcqNśQǜ* z!I$P純S xhנ0`,D.KoAbI¼NTʫnCk3ZZ!: dYQ}:gңP%F{Ռ/ c=OFs yy[`b|[ޢ*oz/# 'R`7a;Op5?"$$Ob0Lve3tp ]!ڧ& Qի+gՃ2t8OPB_[Rvѕ~]鞮z ]7Kt 9eWG85ph;Yu߻Ρtg"Ti!Z'5+ h i="ʺ{ϟ'ßLHH +BZwm+@) C[08vpۮUt(JHbh&T+LW vBp =]])l̨I`{є{bM͓"p8[y01 {oӏ߼L'WեJsG 
.8;RcAVVw`O!F;4pughZv&-z>!֜A1J^&efn䷕cpct͓8CAU %78Q>mG}/^K۶].!~-ȇrB+a1 r#\f0<-6,;±tэ_i_?6@>߾+Q5+kI}>In݃}UZH%+pYl;|QygkIn1\R>u1]@,vr;qQ+Mgpm\#Keh#"1K;?em7%)bZZAx ?Cئ vgJzBw8E2q%u ]!\uh i;]!J۫S+U]ZJ_zxW ъͯtu2{V9'p^X\!^V18eteAWCjJ)]!`;CWt(i.8]`Ilg 2Be-6IOWCW\+I 3:CWW5+@ JhۆՏUStp ]!ZCNW ҕ` T#SQ AέInT|gɛci K#\ݙ);DkLYQZӳ 2Q]!\ՙ);@䱅+DzQytVt3tp% ]!ZzQN%viAKBt)DyOW'HWr+ka3tpygAD+u Q;Еݳ9c=,2.tN0wUV88gon܌_L"I$)fpBPRSW6Φ(Mߒsx+umc0  /Z*\&|etqǃE=$d eS˜9 M>q1zn c]Z)\H1eQm;Fʂ8دޜXq*C(7gߺ+.c `m?@Q.):P7Gm\_B4Zn_F\).ZeU!2BgQSEi#|Z>IͭKχq]@ %E "dZ8mC>1XV?ЫqzGh~R1Ymyηm?C{VF72b7HǾ)`>.GE[v*q Wwf\-mRKeu__ %!WlAN5~^tXL.޾0B%9M*8 ˝*l3b Kb袱) nNeq-Ik1.KäԥD# qs.I;H|ԖhI/ KU>T,>vͺjxW f/ϐeAylx w}`q_?+דK' WϹ-00BWQg(".W іݏ^j-ղD7wȟ}1J9!|& LHcY ]q:*0vR$v9$Z[Eg`R3\Fdl6sUbZKi4 ‰l RzDrҿٻ޶rW@Gex3x7vf rHjdkNãe[7۴$g/CCyáh-vjAw_}l'MܖKvV}y΋&eke$P@&r?^r3)+a]"6vE ZQ$襤uCND%H+ adPMZ%Q,爴tTWZO#;a\ bϴv̢2ńದvIfz+),fm BÅGoϊC4xPeN>~HZ8Ŝ:'}ܫ5nLJeN]èwּӼtoVwUWӓW]a!jpd6]튓7~(F|5M;`lSKol N75#66dyCJh'2eW\98LtNnj_۪`[]trS1n$yZHiXMt1IqqWzV;R9Yeqg'4DW/?{w\w7?O4ӄ(7 ݣEQh^t]EMk[ߣ]jvyCwGHv*o{?]|z5LpL_ ^Zw'=?WMbă^\|&5?]yT-w=x[b۲3YڑninX&XÑ $?)I$ОeF9 0ΒD >* XRHJ4s=aaV}>ai.eyDkB5Zշٖ&vqzM-ܟ-7(hki$۵SD- v:U6u&izq{pLh6'n8nf[;>{/lڙi]k;2{:MB[v_BEBWgq}1A&wܛG ²>p2]/jU<)k:PԲOk_~ $;&pZ?W{sǫ( @_<"zIzt4mKai|fmͱ}L6LK`i['.3VUOJs`I=]TWϙ+9k[]C~\=TLYL<4iq&H͙0[r*n#锄9 x{H5|oX=+Q}HC{ԭ {t֍#C 3-BE ћcp܇d[ 2p'3|P긁,5&ey,h$zJ nW@|B{Y]5qv: {otЃLdwɼ<4γj}w $!-Z9&NJѪLć;lk$20\WH}X=ܐ,0i41`ߠPJ#LAtц vy蔷נiGj1o7_֢ vqp!VoN=1׈ѭϣOQ9TvN$!r 479icVu i[ <#tXKR1pA !`HXaADGiMS(I_? AFAZA,Cʒϒب3N "y+idsc5qܪ>"I+r.nm~FF}1=YA sy>1@%4f2fZKcQiƅJH7VtM\3 u :r;YYk}{ B}\yel@a H^S"mXHYaM>;p9|^jo=w) Td&Lj۬}.K,RS!VLsYrE~_}~NA[A;܁9g#N>g hOSt'P eH)CSq7DsSN\Rs6sss`k{#v"-{$ 5~w?nOa D{"t#垉qϳ D 2"`ItT&& 1}3ȅåY g鷽r9I+ob[RҥzOҨ7reH+BLm~w4ȭ^7ٻhq!N}]~9>Է`QAn!y-k@_+mvB[YhoAvH념'CZ˳gia.#uRbE< +a$BS-6,ϒ, # &֨ㆀeZ! 
!Xe.ˬ T kL.Y|2C*Ņʣt>E &2p!k!Yk1 /6Lg󫉳qu R{wjcv<[$LhO@g;9<9%/y!c'&;, /!pY8o" jվ =̖(3Bw<;aߝ?.jTDIKF&FZ)1sm2xSpH'H;XnnX}AèmҔύܝmx#44|!cbHf4px9/A`cαdҍ`ʝ$(Ac.oJ7T)Jos4Qv2VH%*gn;5_]=j^)}>L[<z}x n2bSJ4[ z%v95_7wC)67<~;~^)f֡͘:n2\0fA~Awag}waqR"i~Vr5MFw7sRKĮDg[<;I+5aTBG26dx8tV\ym`F.>/ <)(œZ3ȔAl4>FA\h B re00n0هب~=3j:dك2D;Kzh*EZ5SCIo=_N136,WU̔& VdL跘@ )RIkfVg筏Xt:&?+k͓^퇳Q/* Uޝp W 2.0%,DLT)"!*3Aj59#E, ӓ 61@IQ,w+TmXM=RU*la@[z-|艒i ol>O/ 2pEe AZ?OW;F\H%Rp6'].(wOKgh*ˣg:Hhpu@PȞ@ X)A .Z"a]81e֎%nƣB1jWڼVF )J>, g+74&$60y$Bjc1dBELlM$H¢&H6>!fIVkoy&n{ؒť0 5?jEe(:Ym3>FxTt xԜPΦܷ,ɱH$*_;d9cč]ֆД !,uF&HLZ k&n$ѫd'@CuV%]]!%e.X. 쳵,-wb`Q#A(Pt}g.=6;j='0aO|_UT .Xf̽@BywEfOB?#pn2dqbl'bH.I)ݯz$) XLUUdۈ%<@rFϵ,wJÃ&eĆ.y)|!v7t ?wџ|Ϻp7d,sVډT$RQI9ޙH1C-NUQ%Q[bJOⴍ &1五bҹVœN ^Tیu6!Y&&VQROiƁZc5+j$ɲq^ĕụ=wvobm}.%2}a[;QD=x||seKQ\6fTrmCY^qx`W͒<9v3IG"KGJX>|,Ku~)Z,pb%CKsUВsY섡T{YsmcY&ޭQ BE  !lj9#בiBʤPHXƭ#vF,B0#qH>HԔ1тFQ4`pHnY.n5Lqd:|UlJ/ ^>:\(M> kYM;~YP|%٧Q1k`E4 ID9 htAvDH\q 0<08/0W>^SWQ&P!:A{ĵ2r `xZd =!"=AZ3E@R2yDpC`5s$k;YZkaBvp7w1Ӕ^@wgkv@+4SW өeCFy}y#v1l(BRj1xkO#tD鮠D牲e+t~;iy^("&A!nRrXo"71XO0Rmq*5D1;iyn0 nG- K +a bFT8DBw^{CUr ɟ i4bf[a[ hѰfו?0Q Emkv0ulI^j ݮ^T _8REBM %5`„Qd#CJuaAӂv08#̕}7ZqszÌ 'HfDž ca>LUߗ3Llb\fdq:>RQ7Lq0yѤo^Z@k.9!Z y[ مg7|sN+ҟ|:%Vz>nZ]rKKm6jmX&da~ڜcAʄVc3{=,:Zk68*j߸IQ(4=Lm{w! 
ŞD1s6Hƀ`KG<K %{`L.e: .\||i q P&X_^"dj Jqcv MNr|.]Q|L> nDM{fَIսK{=`2\VJgUp$Uq` V5b-*2~[՘~VI=CsfVy][ÊhCۚ/58hʜqPw P,̺4,Y2kHzrH|gh 13fZ:;j+Nw,A8!Vz劓to Zc J u^@OY_L%qk/).[ 8& E)}}.<ׅ)T,CW4'պd}̎~ n<-i:u)&=}`;*:Jwf"P$O|)<(nƒ$( 6 ]+*zlsr[bXO32lP6ThqV[~UYYq}Qį'~;hV14}VZ*.&({ci2j|1]:/Y%zF*U"WCDWJ; #Ŕ< 9QW\*+r˞Ure ++UVv v5M$׍$+U͊[(eA' Ljf'7f,a4C;ތcP%巷﷤r.aQT{63#oǟ gF)q_|?(JQ:yڡKa N?{My{3zE*NRb0pfW#(m64(Ass)m]0?\܏$/?j%>&OWNs rV-`z-M3 )rT cHy`VqY"tOS&é W"/ޖKaNRʣoff6;:b@&#W^qAK~l6ǝC !r"΍:,%e"3;V4HuPEؚdR;MVC2rQEqSpr#fjːLj21j ^F:U wM)l@k ڴFe)!eZb"Gg-_n53,-|bCV2% #_z5~_בqwӶkaZh.`]TN1E($7a117D=- t 30 ͤ ykDh|A3PnH3|5HQR"%ADâҌ;$VD0< McPaJn?܋uJ!qO"C"'HU_(Is(B!rt:O?ִB2R7( ވ4eag-HGSPi6XwXSst ݅Masf)1(DQ@@3̶3 PiCo!hd~lָOɪ1xs"p0Ew:q)6[&¸xۮR7p̢qۏ0BV3e0P^ o"U3 CHozI$l7Pջ(\dk2@\+;.$ _]I)@(QL۫SU_B( 9<%/JAu&{AI43>d)F6Z1֧tVO]ca|xb~V=x*&DĜσQNʱz9+Fl4t@|to10zjjIƖfHc3fV,ZF`ƏGëzg6=X4J֝ljKhuB>z}O c`{81V{* F &g8:ywoӿ^oN^p:y7'޽/0`S5@IQ o~ڠiV^4UljՀnf.7{kP-bQ1n0s):UU_욠z3_%+#/.)oRE^;(18,eԢkՏK;Uʕ$<EsrЍc @'xprВ"#("Mɭ 5G"y[J?5a`Jx.M#(4̮RD8EBSݡN+:{52-Fxc{ǎNct{\Ŏm[x"n;v3%;m/9);/-3 3rKFDٟ89N{t?̚!kzk[q{jQ_=k"ApX1`u[ M+: ÚF7Wd.1@+zZkn1GeL+OF>}ODq #r%yTR#JWK4EIG +1X l⭤v_0@5X-tj0ب)j=m<~C4M7ȥPM&j{ʽ 6ݛ|{ 0 [7ǜw\mcx/V:ނAFhM!19NpΜњ*Af7W؞_[u܏+tmUi˙S2oT8d"HsSDp΄F:8BSӪK r2Y+냉KM #(H8 heu5r_͡zwеΝ,Vu]Nz:'EģlRD=Ml9R[$qBA6Î.Q֮ịUh4 leF͝QFG DF刊`) NP*:dZ^ Szx:cO;sJXy~NQ+:C͒YP>ӫ~17_8!_ZP!GS*4iJ\ A*שQ!9o}S![Ѳ28P~cZBPs'F]HS6#"$"H6""t9Hi H>IOղe[Ȕ-{z1tE6Sd*2+$&#Tvwp h8N.@Yd!zbAa SM{)̕ h5qvOrv)6vvg >_]avMc-[*>&0mM37GsL_-,?~I&SuQQӍV6YhʩeY|u wޖg0 -r-!c&Lr`ʁ9aM% R\H|'dY`q.PQwhS!_m̵=ڪ=ES"j0ۤ o42v5#{5 :)<=^{^$8P7>RhxO>kE`gk{Y#i VbY}քO\g vB 0)D!pS2sQ5 U&D_3 IG°0 ~iP̥ KFh:%Ƣp<;\ b .*Ax=UVBBϴbD.ES6x<c mc"XXnj.fQ^H)&:I7VmЕHΤ(@/QzizyRhU(43Zą H9{oIz-3)waAp}c&%VYXf.% eCp ;ImЕe5q<c9kCuĘ0y9pu{t Wz\k) YHp7zvl^BnmQl͕/Dh{եKDb1׿iE*ݶnVWO9]̎C㰽7 dC%n_»+yPN.׷ݱ4aQw莊;'q;#[uE de^uvkv:L6,6#\ ms+O b=hoJ-+Jh q̄ kƉdaG5ٕ:c&0SPd[5>2r2wZXFh3d*0![A &{mOɱQ̻.wTope ʨ,q\JbF6kM/B'L$ U53eXIIe+w  .0l٭|-DKId5NnY6l !**+J.0, L&, Fh@5LUQ[st&YrAГb9j0@ $\KϹ j#c5qv#c=R p8R ,jN,@g| 
)HC9TDW3FHqR ڐL񷤹H#!B"B8{ HbtJkFɡ(+pqŭ։q\\ t쳵,-Q,00ԑ(d([~caq(Be<Vѽ]l(7mNpy?ˏUdMWYfn1(ƐݔP M74C\Dm1ʼn; }rN서Y%'fCKXS0 vDPD93dL&*QzLe8HSb:GT+9˘0#ug`hO(x,cr(PW8m + sxDPEqbzӬow2Wl9TnWD+>̔K9+T)h0]uӘQ*+1ܧą`(W,R%I(g#4dƠcXh&Ξ>_RASNGSOc$$hH'#7ZJeYAZK`= :kvG"I/tYChC]]XyelD,}ܤ H^SLXHȬ3;DȍNTӒ`jF1F\`%ci Hɤڗ3n^(TrJ$ttlb0?Uu>}l, nY h9IL|J]cf8%[̶A_BI?}xy13Z7فt_XOK'Ю?a'i{28 Z\NY2 sEb `>hVW.sۯO?\C8`!&2"MGK*W[0;FOS'(H>>iz9N{zx.a |7W7o=&kۏtc]>ǿM&ʛ~|mзwۑec]łl6LK2%L#4h,W\#/E,h@2lQDkW j ]tRbx,BCZ5Ъ8xx;yg/'(1#!7k (*"dNX~ NIQΦ(UD֖i, $CD:3/1QЂc\+52,_}9>*lNz"gO]~~*Xz,GOuݶFTOJs}ʵ bG/k,IUQ8omTMР%#(K(ዃXz.~b`NlӨ̣fNX AoNKgP$ףf8=IY+7Frc OY /8..ܫNt?6z'ǁ;RZ{v%k̿BHl "lg <(}[~+_3uŘ.1}^&eG _JyzξbHZҮj^&%FdIeU.=kG%C}V#ct]3 eC)-I^8{NCXٕOk {XņեڥѪ;;Z٫z>]{zǟi]hSF%>u~iDqt7ߋz5.G1qWRnnN)7`Y̐MRl7#Bu pQpȃv;sSΕKkyp?+=>o;YdQH riLFgณ*S҈1@(̡/@ZD9)Kn&%'A̅ƒH-2>3Wٌ&Ξdž&IR ~q 1XxZmZv2EK,;bٕ<d.ȸ(&Xt0AdQqLf@ B0a7X”D4:X ir 927Y!zHT݄>@:"I!r DQ$B+/HH$RdGQNJ0 @Ϝ'.fAL>^쑋}v^[(A%vg$ ^QӛLouLd4 4M܋1cSIВ9ƓˬĔvQ*(0 zhc2JsVu00|J->Kłt^왒!2䩂<8)A_#_ʸzP ▗ѡ} Bdh'dUԢGg5C!C*RT۲R_!d%Tċwn4YL01hﳱX\p S )fWvNaI#Z=ťT Kr;xX3.@rkQ9 d$CTsZ>epҬFӜ.m>qRܸxgi8˾_|[;vC>p+ޟ9N5!8Xp _80Oqr#$QҴ|m^nv\ 7Og7~"m_!Vzͷw\w"x-)纊58/~U*I؟[ -Bq- wJ ě;g-28Ur/ޖ I2oQ#Йo:sy|f{TlQ2kz\7=ov,Rs RvQOyԴX [Ӊ9ܽNݥ cQcu`X8tTg'Ua jo-\'!!ĜTPu΂QI@^Ix2m`abļn󺩴Ta'O<O޵6q#ۿ2/~Ju+{cMvSƥbD 13|A"Giql  2p//'͚#J\u-ZAqZ)!ֱj؎K3O('¦ؐM{1 6m7Ih.y]7p;4#<8GI`!ž# ]GHRR^ 8-`,EnjoT.dA=W! #@k#DAדWp0ztW|HX1rɁ&Xc.{~3#5#5 JEM'iu54=RWI`.F]%q%u;oUgȭ$S=RWI=Zi)y*IIdVW/P])^&дWiwuu\Y]u9!x$lo$*)uVWߍkz|wO0껟7fn!?[nt;?)HzMwU쉺sx[tkpAS| iA->~0)ڭ?`?y?'L29^&x/er=~u(z$j/B8 ;7'L|&4Pm{}ޔc<fiL4LQ~ڤ̫btޭ.n7޹w^t 7c9h,tOV|jܡ-^~ɺo*b) x cs7φXYWy%7:yXrZN>,.Ų :Mg[&|<sז7]ci= ue ˾k!J".k&+0Ԑ5HG5$b[Savw^&{;O&adlHay-Xnc,Xm bt0XG21jBm7 (W)0!S&a5mZP⸖[qyrCk_zVP$]pFZ=}< c^uw^YR}ztӛ%ZfuB* !ra2ccpTZp)(3f\aBI$ֻ{$\f. :Ny >(%E RrJdJfi%1"iH&5  (\v^L2'IdJQ 9,TmEҜ ELtgSmM,$)}N[vقt4EL*DÚ" ;{K^H.MI#Ɯ !4RcjQbpLGI|!Ŝ. 
O*g*SM0.~ΛzgzRS[[7Tx*ajU"1P4Y2w> J/\IS #Y4I; t0!X[|J&FFcPo%P0 fUtb/+X9FN*q[ם69igcZҤ?Uqi*%R^_\~{WӁ^t+:;[%huN.kuY_RZ0]#UZ?LUhdA,87}RP+*0'~+Dž^t ӷ'o޽M~x};})&o'޽/0`S8X&[XXV?u[MC{Ӧbk4-Tn|v%oKڽ|EEۄ#϶jQyu߿\Ǽ&T7?l ӯ5HMv7LG]J(bAuUuWARԎһ wxDX| 7"`1uK)K01PiḂmO~0ye|ЉL#x*Z%0B$T]h FpQhxJ<'-ʗ:\TjksbôU"M[m{ƶ'67Kcnt:4aW_Hrzҝ/OoBlK*Ĕ#"᧠i+wDlm\n& rW={ ш +!F#V ,X96H`Zp4¾ɸ`dVP>@'bT4xaC/:|Dο%S!X9= .){Q %"S핍AΤ\,z6$,y >.gb+f}goU \w{D"f瘇7>&\ 7+oZ@H,d": (=Ia;Q9)yq%Kbr攥+2LC2 "sS4m"uqF&?wbGUgK(]#t K3e`ny.RJ4MSωz$ь*Î.W4v=@1VYEy` 4wF-#*$;A ,iy&+N;;X`9̞zLɖg¯Ȟ3y\)A[,+U\Anë8S~|kYN=*f)]rUT 2# P ш= ,BΏtL A B{.H ybʂ儔Fp59,QX:=V)VcKYB7`3ø ÌƖ Ĺm.LY Sx2 )vSʳfl f~{݀El>*0&*iz?KF$BvaŬZ#7)N_Twj ȉߙчIH\?2J`ԣ _ydW'_޿NZEW) vR`07% k)j&̞?oeuj1 >H S)Axv m^7Ճ<Wgx6a} h(oWAE]є|7Oz OQ|haRCT03#<<(`6Wp&j ꦔ\-HDڧN)Cd-x -Z)3 S)^5E6{r6_DLe: Lz"8jtt EkV;n V=k9&r6ȹQj cZ'RHDc}sg  i}jx|-~s_=gnڍ aWyzG-Q*pK~Qj^l^yIH "&"$hKp4 [b Vش'C0!B*Caj3jHZI=6MA"+%e5q֌uw5^U0KKvf.V&Gӷ$5*-_E36{O.~ՃqU'!m-JFƬJ//M2VWj=Ww1xKJP*]֥WWS=x]z4 d %f_U7kԼPr|H{>]}::s~Џ"uSq}v#qnzHljN?-.rNe?~t%O@Wh*rs˾)%#.UD}=6*ٟ[p90K-4a0|Uanhz.;;k)v(R,?\!3q-Q gD(7K (gF첷/Qٻ7kvW?VؓKpu(5Tqđ$X6 X!d4D XH[]!V@.( ~]`$Su᭔HjkܿPça崀Y7ϝvy!%Qαa{f?+&VLbY$&VS$3 &)E4S֮D0ϝ?(V5P2zE`P>RL@R6J ^Ȍ ^`A ƜJmؚ8kltak3cC]mֆ.(GՅnsƓd/+y= |u>u5H b0~ʃqiT "%&P%5P\ao3"!* Cr'ZIRTH'e `-P:"0ĶĹ_cA/[jmܲYkg vKF#*G (HCŸFY1T1X >iUF.qVqC3 t D%rIx>r*iq5qׇS?Mqg|ǦIֈY#oH̛h)CG2+@Di\~4I< D((g 0B7Q%#1KoA`IrC8}֡{8Unzg~dJKy&S KHpF̡0JnpR)v7]: wd9 d%l :z-R0s V.Q < h$w!r $%@4+.(-B-/,@I 1" p8=1BIt10u򮚳š9QJ+=*㨆VZ<}\ʋb Z)!U:mۏTCbMkUDorn<&Chb<4ͱv3PMoaj`'[K_ыXvR[K^?9N:vek+j CB m 'JqV~ uqTƽ&Rr})Uu!XR&#sA : Vj8;-.䛋q [}E.M'hnkC3p]ryE y8ާ/%Pz-6ܿ 癠@l9y1 TssŋI[+'PͭԤO"ܑ?'>R bD9\R)Vx}Xd$VKQ1hp^=RB}u{r:= ~Kz!7ejDa xpjB34]HI.(r2RJĖ^zKhzL\ ֋6K/gDZ)CTW{DS9s0$hBq4+qqՎ=wvº6eyy -v=шM4Jx+JJ4 "DK9":Gt־[Y&t퓈Ct5o" Ar$Ph%lT)⌢iZE=#V4~o)>PX*Cr(GP(+V |xnVi-9ZmeT8,7#BT\b Cw=67&s4>}ź{c4+Oc2TTR <.m]LL/"yƾ3r'18[wlQ W ar/nңBS*! 
GĀܓ*&bq1(b P*9?5rLQygȲ:pt\5Ί7-L1>|左VJ^eQqj_8uU<ŜoU+2D2E-peHfk]5hdbM r9C2M܏3TGqdOHt*dEbN+5q7W&zpp6м|%(V|Jէ}B YY..\D7iyy:>/\PŬ$VflV(n^vm}46`Ts[_*nivѼ)F<~rv::0,s4~>mƖخDE5~D{bgAH[M֔Wy[5ILk5mոO8y.>F׋g:'j͵.:Vk[_.kX#brїXSƀbN7P&~m V(?::ǟ>?;otyw#0#z5'7OjڪܩmWUc}[TLje[꽳|'E[CXnSwUq8mB-2]'?ikE<̀+*al6.ƫz8-`^Dݪnv5pjbا|3]v|P@%.O'[q]k}?>S#I2IsBK1,r|`D+)8a`8kݓm{g:Ϸba*rd۟/(qϳ6CqW֣!)k 9Sc[cu:9ԙ\J&Knsi=6Sv^GUcoy5桰ϰ>>~BDv`GN;>N'2؇yp1rgn _&8$ԗqrJE _zʵ+0d9 YkϺCfE"h2R# ІMM@3TS <1(hvjS-W1@C, KN٬LI\"jO@uuΚ:q2>,WGS՗㛸 _f}|ߡĦ^d5 BC(!4Bdgr"Aw`|nd)װvg0+`_qo&=iLgڮ..͓6rh䬙G׋bzCOt ɸRj;"gKe;ၣ.rLq6!N3)"$ӌIPJ@\+{MgQz\MQk슞]{&vMr!i@C*@qWll8j։+].MpUY_9*踊'͚k5(+TA $c:1t ^J`m)Qpe !aJ÷SQ##Qm-p *$l  hylrj E#P (*Qϸ 2 QN8e%6<(::kV\/RA8gZnXZ{c8MQh!o[QޖZlnAaF3MJsk^PCl$=͕$=̭5ГVZ'=| I5RWkvu?+k[|qUÍ߇cRM݉:h:By[ $3-'i9L:2/WqIss/4MZ'+gH>p45 ( Z|cfu^+qyjEEc2%ٗU?:J WJwNj0.9\(_|/]$+ðTJ.'FDǨ]z@GlP.Q.F-U:4.R1d LP; \:]q.pb IJ6 j΀zj1ք`t֜y J^Og̍}0xl]fmE/fu~-I!?ۺ6ue \ꙙa3/Z.y{s8B쪷ϻv,6&ۜZ#msэ 9<)g쒩%ڗB$U:E(_T:ޟQ]ʃU䉫d^nRig9eD1.@.Sh_XQƳEm -~^(5Ol- ˹͞Q!N AR-lB*yqbXXlf쉅%P vs..33*z}雴9wN&0b@$a o":cPV(qPGZh,S:1S6 Q9{60Z,&Kpl CD02Wtb)qGl7.6;EmZj#b1lDyp3"7#48Ji>+IHeӂZ )yT5e,2 + ԣ(QI8F:tǾ #"q@m3>!aJdzHtX/): }4EHȂ 'є@NM PKg\p>*hI3P*$$U~CCȥ>:͒}qE>​[##h)C xCMZ$RsF> BAk\<.͎} !쇇G  яȅV~Љ[) W(QOzg}v+pL%N&)cl[3?Pu<3V "}$V(<K)U1W:G@493Ņ*xT%zUOj}cF #>[bp"pNNxͩdб"aWu]vnN/`n-ΠJ@0υ0gSⱯJԟT8E-Vȝe8kdw1ƦD?,gNGI4~V/LHo} [?w|w>ԃv>Pp.r:kѠpAbv*MܡQ$с>ipTȬӮuѬ[_7~ۻ[)rjwq45 Nk\BwfOFNQ{w62]O;zP5Zyo~W7ٯ-SXv'g'EM ̯}[[z`]\lJLK2mH="f-L JXH)qU$|ف(3(V{q}j}v'V  $9] *d BEMrh΀aZ9MPMFeӼMj3sC-G$W(=ʀn]x+QE(Rhfq[z^=0Kkd߫z{xo%2W7XBH@CuOR rhQ\Xdp*T^kfE4 l_ͥ8 ɴ 4dpGwc-Yǻc),W(;BX&qM aYH9: _Y)BWE OL3HD>ʜzTo*:h2M:e% @+Or54rg:+-)1ZU&{+/5wWhy=-bխ}azMY+@^ D~P L좄ی++"ΪX+U.j[9 _T= y(V5-+52Ab$FJgHt;E~crDo0A զlry^YCmD;T1#֥ _8Hfe*ɾc4/Wxtq+#A]M[DOz6ct^dHQ0[E )*+hBЁj~Kj'ssVN% :>(ϋ _F`0G0pMޓ+@8OeC0ǃCLjM<(ÁDVHu mm"0/.&Ύgj}^/Jo xȮܰ ^߷W72>Es6#X90r*!pDt<TDy;O`"\T0G5 A!f0+tD'2nhcCe DGU!X!с ĥTL$АfT Q992;Q1xΒj}SCatη~1Iǘ?|srmwd4-.F7Ƴhe%(ׇAnxAV5F`ڋ !蟫?n⁑2߭}~SXT}Fe]x8;w8xU@;4n+Nv2 
j'l[[$jcR۟M<=kz㫶s67X\'!i-utPs=a7mk&zY?"A' G,_HWFjKR|"Gv{x_%m!F&ˋG-t >ر?-l-?ϯ^ [YOgxX|#/~ѹ}xg+%yv;LW2y-ܢ(hZfq[{^2K{芡/jlgZ8Z_^'zGK; nm4y;eApՎⳫi,Y9 3Pv~Nc8Utvr]ܬ=(|(6XGvߧ>c#˸o|khh@oTK;GةfٳF aKHY$(H9wX#+cC]oS @=kdy d=v^.e_|6F#h!ʣhcdiET0X4ТQVA x0Y\F WYJIzp:g2}fя?a?|q?5}\5_OUإ {+ڢm,b*p++''Ƴc`ԸpU|Vܐ:/' D+Eٖ)tkJ%AkJ1ܰ-ؤYZe}(St#tW\3>΢:Rq@eJ^"Ǐ P#5cĔ=C\y)c$gj{ȑ_~h,;agf8\E2X<_bQKL۲݃Xm]U|7>AJJ,&DxEUO˷:(=.#9ņVz3ڡmxq]ܐGۛ^nx-ZW4.報-YY}/+"[pF[Cseh,!#HB:`>w'.J.LeYeMjz- Kc*SX-yDH]<*A {5(2܏ifp0u{jg)l2RehR K@`OY p: Л?Դ#y'%MB'2MGY%V$. )in#s:sG1)όq PsXʂbB岡~iїe@dPezSֶ%jb~3k.^ɹ~CpFe"qJY\MY:.ݼ_ZWzgi0F1O#A(a/e5q'L1-qt>H%$dJ( ]3l7"R9Cs2qklnv^[4jׁ97cx`~ZY\6W/kЛ75Rq#fNs8GUeGJѕvyk+'NhHw i~`vyIKu}S۫Vٷфr0(O?̞t7veh0~=ޑ-0 xgOJ鼫ٍfͪh'z3dǣAOn<=?n9=˩X۫`{]dW]jnI id|{|ٮ_M`%l#>N4JClYc<8!s?_~~|x_>[qDO`&!Hw!^X7]ѵ5Zڡk5zwꚷ7#T-m).ց|?QtBj=eӪ/gMbħpp=L})ʀ&mG%yx}#PkIJ)yY5IWNtx!L(hF"t$bA !hbȊH愴_l{ՆEkZz@+ˈpn2:(ܟW1n2(܍ 4f Cp>E$bԩsPbnlbO17a늹>[;/6Q2Bv>Yh^3'b^ ,}7蛯Z(,hs@HDlMFf$HutBj~RC{={(dB"?3Fڂۘ5B_hjj#!QDUye/\P'tF  J@Tx4hd @ g;C˓6VNNǛcYJ2 c!Qalf{EVALWb E];x/?w%WK_eRJuɳ+xq>#58:A*'&TtAh92%})j%Ca#37LJ\u0ODa=#]2`x>4\s/_ذRe@^sri_3Ænl^gmҨ?VXxDdlQ* A/|)Z~9Bs)hbi f&ZV4Zn)2p ]u&|,ARAq"W*qYf.sD`h`~at1dVz:b~JV q)A3ͬ>hlN=oI13,WuԤuGrC&AL軘@Ir+SZXQ gF5ĢeNG6_7KJ-;SSE*{^^g]Z^dӖou69lKKq' 80eijTޞs,Ze((\'eHm`2 Ky¨Djd4`s9VU%z# )B 7sjGjaje W3&kBXp9roӫ;>lJN'8?W.hp8<<}G12ÄGNyCN$-JJO=3Ar%Ȭ1`2%lJcѨ@  -QrI91e*GjxW(Djֱo散6v`7r6rm+Mf.c&VXJҘC`,AIT%6c1dB\F,ŚHc2" ("B c֑ }jpÖ/DpJ`<N_>rD}D#Di 5<N(d9d 4'BB+1hI| 8cčVl,9U"8b@P$íLHLZ(+%@"V6N@qq̝:YɾqQV}\X51Nz*:t< bIb ,$*[%V}㡪~B2+d9rB ~B)sIp}$k;.غMfqϛYF} 1h,nhL.| QN|$AqI\ ;$AV)IYw^PʥXS*G q! 
쳵,+ qh  BA{x0fǮ*CUFyo*+$>Ye0dL]hzDbP6CwSr7t.zAY~% G䆭]b٪N@ɵљ"(g EQ&HޢA"J4U0ΐi +Q{2pp#g9D(Lq ̂V"ZS2&ëiIPyO h}`)rWe1KM0RvҚQ!ޅ PcU H)XdhH'bȍcXdLgAgC6K3>h&tB3gULcb^GM 5΁>b!Ef-)ى=9Bn|pA:,)w֫CޛR KCVdRKfK}[ Nh%w:v@:vm\mr3lՉr *˂,ˁ/I=Ӽe[>V@.Ω9_I蜠7Cnk B&L}qNywbƔu5IQ& #*-ӱ$tμD qsuzRޝqoeqQfq^l .7Y7+ f7l-0={!$8ǻL) 0A$,6(hZUr@zFM^|WAN={ݚg}k[{S\i^ RܸTKB/*/ڱb6>/=o3D{<*39GF?H3^p꾣.G!ͷwn@wWAl,t"(6L5!Y JFrcȧ`HL֫u| ;U`"Q &gI2n]A;F͇[93׊u.ټG}Bp^?v^o]q]t=*|!TćOMC/*ZF:~|_sE}>nGy.ڞ_ IPfDhnEp}'|w}ݡU-5M+Ơ~(NgP P3M=,(a:4`: (̡j/@iD %!&e4NЙ 42}fr_M-Ӫ'cCoS:$f1< _4/.ݟry9/ukO}-2FN1˔6T 2ZsY%0I!% 4׌N\a/n7 /` DttG6e \dnS y$T]z}n K1IE+NJ4ӈS #iW"J=pa&+!͝x+-9O6Ag<{Dpw{jtyF͜v OdvZ M{@2fs*IS{2+ X1 \4g-[<6BfYĜaS>%J)DbA:/dLIpby \8,VjoD* *0nfdKZ^F(,w OEk}=?H6QVYW =$ԣ@Q,<, XBVK5xQt_7c1f$ jH9 Øٸ 26gn ߅{ZڒBm 4(=Z΅\j(0~V|mVEA.Ae&&Z0~$Fmb fsi&)M7\赗2kf/%j+x| ,a7cuͫ_z=2-^omVwxӂ-.Dpog4MtpEcs9;'t}[N|k6(P .VGb6LLSW 7vftL9׍TO} Uv+(4݁VBzor@滤>pgM~MM%3Ħ뗥TlB$ttSͿKЭWK 'vC\3y tݩyܻ~+ty7˦?l,yB]=%%ʬQHHlFgʥ;ݳ]𹜤{D]0ڬ u:m4}IbrvYwwEZY%7 Pw699q;Xt6M] =7 Kn9_AfV(ZM1s d!o l(NQud'折km7'?7e,0eMA ՠP̤ i/p)4(=h͆jeuCoog{vUdC|`<̻bW>yF9`Ҝ{<%iMo[Gǣ{0*D89MpZ63z礍۾ᾘ3Cf6~ٝe1pA !`HXaADGɧ)Ўe Hd3w枆#ˮܽ,3[;uyڱn?Pz=|3SlKcQiƅJH7| r 0vƮپ2 U1wBtx5O"52e &HmsVbl/uO5I=mwQBC4$ =q nM}{6t`~N{&.2/+M_]Ź2TcGgC=79D'1;B.#9@<^oI^t>B ̮r}B!3tD_`~5xҲ)}; O LCQۓ{1)׿߫IzB=O4iAW9T|FJDI/Z4$YK!nu>+,bOۂ3x-dc";ȓ[Hihw*n|a bX~x:@ yAG~S1)W'g /j~tp\^MFә= >lwj 2HhTG4W ZA !<t>ίSwnzF}f6ނ )pA̼F&ct! 
bz>7āa$~#1JhuߙmNmXۭ.PZ">3\f4sF-Oqw2PTR,J04pt7K zTW8c*Ug%5==%W$9Æ\ <Mf9 worrrKX7Ւ+r:Kn)8Ai '5cE6ies0h9(˅ !:Gn4-=ʖE(0FҒ hH2uYù!H!ipvB4J3w {iefkOaRVF2 u dR-G;Xa e0Χu8E 䚶r|I^'t"G$+R]Eڅ:sGT1gZ;fQqX*,!$3mGƜ%ʕIRB͛e4*h޵Ep~XЏ #k~hRtzex?6I75 e[G 4P-9.A:Ʈ[F~m(^`%iYٻ6r$eoq6ߏ^.3`03|m!d'WӱXr:H"Wd>@^?-vrEN3$<.}d4Fxy=[MM7U)XO,U8 ~q^oߺ>qפhT!vz1v\mDk{5mR_m^M.U\PŬ$Vc]'aRң!=~y%}g&;Mp=!z|1f9_#DDMf\O~u/v޷ #${7_ BVdʖUͰHbV6,pFj=:8n=ٽh Gc"ԡA ?omTDz;{Vz8y%\߇f+w_/Й^mNƆ:/cl[DHKhftYݫqְQZ`(ˋ+jD@`L鈯r|YE~ /l s x䏁E+|.o(hHBHwF`-Fj&oL0 7nRGHŚE'7Pݩ3M`<=%_-8i9/ ELN2gz츉hMd.;N"X߭P+\uyv[T8{ +ToGcts7jfza,ɽ\+(sՕˮs joB^}',]as)zhC2Twެ+Ē&ں}7W>M٢慒a8ozwa3oh߶;2W7T_ УQF;)P^oŎ(4_uAjg(8*n512`e O!;{.2=z,MtP£1FDKh=MLXVmJ>g~\p1jӎ]6-lik[gá-Hx+jTo$!#2A(bL,`ЃIFwmN}N8FȆ`|REd-"k-bkג"aZfx<tpAQD*iGM{Cp j&i$̇I5ER W.KYLKv]]lڰur ehKKƐ$HfT#6B}g2zU@%[ivjEa{(v{0aûbY9 k*1Ϣ'1g3Ϛxxy3077YɁ/?_hrW/~+ou®r0}]8? 8qwNxOd~㤃؜z5ܠ]Nv~- SᆜsT OW\/4DWi%LsUPlZb|i&LZ(bExN5#zm(&A3AIb.AI6:AI ٺzaiN#dH~{1qQ]rՠ J楿v<Ũlv<**q$"-m_!R)FgFuhٵOsҎ"VE aAHA&$V]ȫ4RvSIjjBW!iIT j EwJ_,p"y+7Òrqu|_;y1*2Lj+N3I|QBrq ji4ZN1v@ci\MK\s3lDyʻk݋ٽ@Ɠ/{4rr _l^UFz!RA@SBP*JLq56ób\#'%!"R ,mpLD)h%TIn<Ŗ ' (4|?[='-}g֧NpB|Z(4n[boA=t>7R %zsTԒ'Z*Ü˄-{H%:>|tL-ۈYK-~'xޑQJ[H4F1+=,CRr>_GL$Ճ_lzs0q{Zs0is0)偙+s[s'>"s6hUc1W(-\xsd5W_bhM\l\eqј,nJk4WX;w{Y8Y\(%ެ~.^Myotz7c")6ۧ k~J+ß9#O :/'a /"G]koe 5RtdppȢL>rD Z/lE(von۽von۽`oX?poʠGۓ-jOV'+ړɊdE{=Y 'h4N^"m@>.lƵZUYgL%*DSB@̀;e+٪'+-&gU`EER bwzQ:lf=8ӓ5HDAr?pI=*,X'LjV+a7 Q>/ˁ(iva+'=:ByDq1Y}Ju,q1YZ-=.&KiLpaB_-<ZOrc.oT$@2Wi+<5`nI.a'|I&|_t??lr]SJnE?$ Q]oGW$`7E~J)!ȇ߯z)q(J=bŜfwMwWuAHuLiBHXƭ#F)'X~1)V2""&ZH0<)c"1Amy(1<71ږrut޴~08@4(<76\BJ>/W2%{g0,(@} N"ʙfhtA !z \>^SGQ&P!:A{ĵ2r `q֩`E*L<4/SER`NJ7%0lX0tFm/6K`so -MzUSl tⷐdMvi^7-3SЇ <K8&@ņXF2e24AGkLt(;(_W=<_QE9MA!nB@(11XO0Rmq*5(D1{ӉIq/"- Kq-Ԍprˋ0S$hĤ :eqQW:/u]h]. 
=Fԓ>N 4L";җ≁Fy1q !EФoxh˼`!aX(%6!G$½o>zuZqA\,l(>g4E-eqO~_L QMg*9ƍV~9 q2bMmT/ 'B`*$FO#Cş}:V4P TY3b$T,2c?F@J^s9f)fzrcF-K{ysfݷtvx;+n_u+#XYJƎ4-;i8 Yr'3Bì,|f윭kƑn"ӲV5),M<fLTl7+ևhyZmZLqJ{Hޱz}CV7-~PRfJRysg ?Vُ |4xd]rkl ,٪*Q ADgD?,|cFP#5h|CK/ȇm~ŅV/)Rj?GUQo )pm1׾Me鞦[&E$sIw E5C#}9ܛ6`:`3$7v8n,LPt3hwc޺t6QEE#se]C⬃@T1lӼpS, &YVky[S=g{T8C+|8!VzJ7o Fc J ]̋0_L)ZSݝ5c7ؽ BzZ݀o$G/WXe._Ajw9V.Np=OJ@+͊td! d6H&XƫmHS$bRj;8B,]?}iq1O}1IKN8ʀYF^xSY$.14cJ%9lAFa-o#č׿8R!0OsJsjؽ T$zxP^l\!t[i]Ec^ku_3ENYvũ+8u;n+aWE5%c׆+Ǥ?*Ssq2rʙ87<:m Ch>d,; ^#Bq#횉TcDUTljJ!R(m+tJNDsy:q;!b^M0-f 8'{LgkTΪɴ~/[̯+b8hhI̵}8U5Tq$O?.eU3~[PH 摮ۆ!pZY,i}OJ3X?n.^[^c:j6jmRE2GHJ>Of,B6Y|Ieeҩ=?Ugq8sl?8ߥ?x?]|xN? ̃Ke;H$,V</g?>ah0^34Ul aʷWnUqK0;t#d JϳoK{jy.ZNM@äp%q=oUUݓa{!5Up׾! vx9( AY&gP,3Ha)9sFk7@ɏ<cա n(ƞtӖ3,eXqJE!mpq&4q_v@L5lU{vEiv>7U]x_of__r"\D%0So~2rWd Wb\5;fJٽ8fzJ~Q_.wzͻ/zF߶<M˽xr5OxZ]c e & ¼&BP5;LNo1j 3ꄷǛ%g|3u$Rf\O7)=y%qQSփQdڛ`v a "Ы0u&eF(/KտT{a"2u\&TrDcP\NyA-((n8̈NsL(QmGs+:2,"HƸ#aREbu|Rg<$aH^)pPm}g_W>zTLWntSt^2cNk3D(S)Q۫_zytdɢ`.nfAJLݯ%IJRj:H Gj6_[γIV7z ԓ҈Np݉px۴}2f2[ݖ{O2(hL-de}d\}Ξ3~!C)Nxn9d^,Y"}l-wCw2.*:?,Y)x gdِClRtN#S?{7{C*;#\W?wn9Jꢡa_bx^/ răw»q%|`<!)vɿYם@vkjHK-{l[C ]]fy_Oj̗]L}쾐lf5Zwڧn|do޸ ݧ'ۼ˘' vB=֗}W8e|'N/t-ܦ5m)6D w~"mG9#m!/p-ei8lVsㅋ @k`$7 Z#M 3TvD(p%# *kE IF[gS* 1i evEeAcppO[=V#I3WzlǃaD >Hy5aS9DC4NMֺ6A([r+T։vЉZ@8NQZOŴw $SH)m4"HE LO/o&\^^x^9 +TFfhSɸm|$&ў7<6Z4(s6j [M*FvBVBGcS[,DS>D@*mSiEr8tRPr'v.X=ؗr`jˉړFSJV1)(44 :iAyBh$L.XIUE!W[T֓[I+bqa#<;~]ф8~/DT'">5[i3l޴Q2td%YuAbkE,hP 7B $߸ҷ$tZL1fbS"OZAb`rv(PWGb4X`d_.ꁹ'.N\|2I~(QdlC(Z`6J\ /Gm" ZGaKpq,<ltC_awWJs|* n~!PLdCpQ FHJd@0dtjShcM' pD,%b2 (ripk]Z6;ATPE q؄ɑKޅ@$u BȒOlamĖsXM1#&\-09}'XCCA"dD`cP\%elHUBT m4Y^b<PIȊq"z(\co9e\2ܶ؄1x|mC.xPxs=MFPdόH͈{B}=sG I+Uם/+龄vwwC#mm~^ тhֱmCn%mN `*XqrnV޻Pt} $?.]5~.2I@2E}u=ґU"skWhpIs,S1>5y]JRV\:l6{i_.ۺ~7?_t`zZî_`W Nΐ_f۲߶Y,q,#YZʀw_f4[N[U gz?=Z/[be?+,Ԕ쌓t7ӳQvJCঞg`]/W:h$|MM +xP/oS/ݶ"u#jQ h 4 QS`dc`C:yU;s|{h8DS=jrٵVo5X4=UaTT: E?*T8D|Er|8W,wP9X-q8% &\ K/jX+prWqթNW JQMUz"Z W_ pǮ3ὐxt\lՑqO;2GU/Rq { '\J'"\`˭WX^*qIJ BW+ W X3cD;qJ_H0W,ZpEjKGNWF D_H7\\[WFJ!'g$qeR0-]8scμF8ܺkv[ 
J<IXV1*M>AL;餕[UHSXWҫ W+T\h"\` P XWXGS9=b8I\yBq(\\\Z5|3V̈́W~Ǯgt/:3O=uO;rG?HU/V̺=p'\R7"\I全W,WZpj;X% W'+5c=bc9Tj7q5* &V+kU-bǎ+Rpu6{Y P Xrc9qe@bE"ǯMOƺb+OjMz\YA7~)r@նlvZ[?-5f-#kuLGw- ]NG䏼IOoŃz Y;Ы;ɿz;z76L\Oݗ[kzpC5jB ,טZB C OY|Z OTMb\OA?WROY|+ẑW,#/~brNW(kEjpjX35[׃8i1{8qD=n8*q\:A쎫ziW$+[ X.Zpj-W W'+ew"\)@jpr-Ԃ+VqE*NW-DEbaj=R쪧v,"L\'j4=!T0}W,WDԢ}̎Uj=q3 RT+d-bƎ+Vi) QxUS̎#V+D)\Z=|3Vi&q(+ )qY3*7JMWrǮg㪏`G]쑝^j?uK+9j߮^<'M5bG.O T턫ĕFpEص{5բ;XNW5o%Ԃ+݅2ǎ+V9$qeW+#Pj] X=XuNWV{7l~pD|۲{M#m5 $TRo芵<ՂfPzJ%v㬌V~7dF4dGX_wz7񨰌GNeɕ뱴XʊB $XzbX.ZB ܰJ ShC N85EB@iꉄ\DBI^k9qRוCjprcRkX3"tu9$XprUz WJL(aṯcW#'StG/(FKpoKj F_ H\Z5z\ʱ6Ups * \Zǎ+E-bJ:M:E\i-[[ XZpEjK:WRM) 4"\`']\eju~"nOP"\ͱ:v\\YMcW W+ a~txn\(E;P{a,ME#_#A n쭱a]F?MԐQU _ET )nvt׻ ,kR^WKdyμӄ{,>:&e^QR0G,MڒUApLE*劜p?GჾKt< Z]$(. aO]%<v=O#@:vvgv5+4ŧvîdUV9nJAgaW:9ӉKaW"UqvEqD~?ag?A>[à$G]+zfWn=ZsyB 鰫.®U؊gaWDP P'îJ h%ή]}J1?~HN8b\ff^&|12<̦^])O_+^'&\Ibs#,VAKҞͧILGB\B͋2'07 Y^8`Aba3 p#!/^|j:> =q5.Ў-}ⷷo f'?_A?wp x9pW7UJL{5멯9OVX6 h&_Rոao7x %:a^b1qٴ-UV3ܐ-|?+~4_Uۃ9۽]}Aqy%~Ӹe-5 V52x,qBԊn_"a$ 6k#sjq>!r8!1I# Ip44 Hw,֑H]9\/ux]s9S܇+vrʑ!\h' `T9F$L#`"'-a$EV 1m`6IY_ hnAr] MLjzXh ոnAKLJvr=sEQ~~۪H8dqD5qV2v2^#{/;jٯ&{q<&`Y _wވ]ӲZY*}9aRaep`U:r:p9K^l?PkY TE4H HbD3V`;L3GxpE3c83g` UG Y២LhBt3ɵA{ĵ2r `q+cE*L<`S4/e*:8%tXYh6znPXo`7 vyҫ4y7T =Z;awCy;*:ާo|C3@c% (BRj1x:"PTKeDYݳ33d=gqYDd9:"Myz)GjCh4 Q2yn$IYP'<38}RJEXHkoJZ41iyy~X?}n4{_!)Cw onl:=K}0DmC@C^NP}lg6p6^X0K{>=j,1ȀlG f8"8 6`.{SM쿮3BT~2jqE/üDy]_#L،BLjMC=}8u8b0Pu m$t2v׽#b~'aPᡫ5hn$3^e]Ks!;x{k\閭i$Z pטD`1^7 euqyٛWssGBҲ|V(,-|U%hٖlh0,&UZ(d\vuGHI=!GHnjaLN1&xrzoMJ6]uHh$dSߌٯlSߩ5A6}BΪ="hmxAGӲoxcd,e.}w7Hm|QaJgaJPFO Bkm56z=:t+;toD||Ւnk bq|FUA?mGL¥lDp=- vzNN 3:ч9/ҮR~t:NX\M(Pf]HY,Y2kimrHٿыΑnl9= ^.g`j£U)7BiY|{e˼mgN*,ZήRa0&E=:Xyåk̙ɤ;N?f+]Y3HWakZe90{ UլZd4ewJ˟.ijRnp7# _8Gl^S %穎AUve&cxM*u@Ur| H!|D2Bϕ! 
x6p&I!㨇d+R3MnvÌs-ҝs"7Y 9dWn+$982DQ ^E!ɄK͘R)2 [`"QD9g=Ef+*>Ha|Xܶ:,:q/kgsH99A5`2!`2 X( s2Ko5Y9VWptuv=Nomg])cwEe.7M5n>ȹ(C{0\SE`,(3xnUxp=,C&zQ7:cHsU͍IH a5 R|V{m%`ȵ&P: C!fH \?Y 'D=hQ 9T>cojVzY7ͮg48ɵ߆u~N-gd\ d?8.co:bt 풊tp>'EģԄtF zsb;I4 r u;y ,Ѡ1`02QAsgBR*:dzwY_ϧ*v~붓: /g,KvE͊~zP3_{0!GS&4mJ\ A*sh vtD&A+D&A< ?1-!(T#}.$) R*{Dc\0 +-v1wڥ"8`YB73ø` Ìƞ5_>tHR1xķOM\^K[jg55vu'^4#JY_}&}U- @B"$OjL-^5AŹWXJv1U02\u oӚ܃M3F\].V%`ui{}XwRv+F~r{=sn/dՖ޵a#ޛevն%g.UW*^+ 1A}0Ad&DmHkdzJ ?jUggL$:\[ӕtQ','Af㛺|ǐ~EePuA']'j">F]`@rxh\Ňa.LYK;(+`amW  N[l`5!!Pv+P㳩װncM|:an?N>3Bjcff"uz^.?m&QNDsyJpSnU^z5yW/@N@|rUL˶ީ5ݓkmі{E5j*AF+`./"NFR^ HVL. oPduͤpLk9 GsMdyDC1`Sji1^.ts|9ǢsTl߬˗욵]Rf-S=,itjgeȦ1Klc/p6moA59s?՟]#"_v3Guwu`7 2,K5M*"mO[%lZ5E&uTu=_|Oo7՛=8F=0K6Dh(6V?^mk޷|Dkz{w^ðVw#HZUfa H׃|8e8 %efģw͹m*G 'ܥk8#n[H%{#-x] ֵv% +SDv)6x #H*Eˌs&+Rq1 rdyJ"%)l$%Bھa yMza/a#Ţ33FMw&hkIH!kKH\hb)R[tttM9q`aꖮ׃=|@y΍F[hU(n;fd u孟6׈C dA"53adU|OBfDљc`mV[drÉzR[iViB Wn95W,b~uݕڎSF&ƞYYRIͬZ"^XT0A:.mD!UN,{{" i P;jl^*`O}x:O_GQ+P4ћH]QSk|p%CH:H}mZWk٨]RJm2J'01Ҿ$N ,V0F-Ø x t@Bf ȑdaewT6%c[xGk3J9iZSob>Ԛ+~ 5RM5 nP6іT3֤ }O/)\8i2 YK3r9*6ēcFUQ%{zg_|ʥ{MJ.%@M)z༷ qY%Α읕*2&N<BqfF MV9)g 1~2I6zG5|Bwbc26-~8ϯ?}h]BrHNaӶ ÇUXnP 4/esj*-P>pyqbzKIeh4@#kRQB2ӆ"-9voDfd^ݲȶ$ͩ\,ܧH(U0۞ූ&1r Oxf1eGbd I Ab RBע;`(R5fC(n/|>a=r0zl0߃ook]_3,]զQՇzϮ>;bÆ*n?ht;Lにy,/r)#C |٥NѦ|]j\wHm/C?́Vߑb55_^ol_UO7?^m5mwaM;Loi{$Ka3*IKgS: \JGiuKG)7\ґϦt$  8nyL]Y%t cc @֍ӌ71aBrwGyJuE$K ,^ltj)׏/+Tګw[w@vܔկxR's>QI2IJXuh,'A1Ō+ZYu[)G=+K***ߟ W 2 <3\x+a{OzV4$@d7BCTADP5cMc\zZ)bLk8W2Tmd&XTj3@, .ƇX'n>>f#=˦a6&ףt;Fp\HT[q eI(FרGǴfa@@P=4`Q`Sj4x#l$D6 !e ֎%'㸡bv͎CQWFmޣv`hiCJm2s=aMbR"Ƅ=g ,@*f Ґr -:dk"ɤEXTi$+b2Uj܏R̴X3:}q("ʈ(zD2iQd/.hN,@g8f'| FHAh$$$Q7׶E@ҍmֆ CO0Ҥ C$#]DC KG.:͒CqQVEbw VZ\ heD}%qPwGϢD C9u=.fǡx@Xh8R^w9ڏmWKc]%Y<ϛGn+,Ɛݔ<\ m7C\Dq)Eǝ 됓 $cA w() %5GAQ93F 9 &(t1I*,耳4%uQ;cjyb>Eg"EG!7[U הuNfyFgT2M)K&Y_!8㨝t![QLUJw=s3U*b17>9PqAstawzJf@`>F}33 d16ZY XM؝ܢ{3dRGC%ƃ. 
d;gtl,*@:a8S^ѽiic=E S )EHKT,RIk6G# ϤuQ`sX^O^Yi1V0FA}yi>wDB2A{.Sh\EǙ-uSXNX:ncђbmh}Db #+,X@[CJYr&}`KJTͧz9u#9#R'b[ .wy7=aN(d7ÿܮ Ҡ @354!j}E Hk4Q8,)AOZxD; QH77,Uz8{;0n7tv|.qlcwAܢkovc4d8,`b;'IhIbܑJAom M[iORbc:"NFiIiGȗEDЙt)x8WR%NxTOSXˮ6xKT!:eFcqwu'mhMK `JZF#yWنogS{NWjL4V!f+Lom,ozD2b!&ߓJl&r5M/hoy{Vr|O ~՟WHS8-Q >EV&Fi Ҭ>Fr~ GmCY[f0u'V=qLk|[T8]&T0aWjÍ ]2mSH-kh_@zѫߔ,)q85 ;(?Qv.-^TSG|&x(GQg3S2&jDh!=@,5Z~8k,ҖωQ{aLz1OъpԌ~ƞznjJ)oYXs#. 1TgT>lϧ|(K"]/Zۗ}CAʜ \i vp ,~8es}6pUĕgWEZy*Rnl=\=B H\ZwHJyW,3"ٮHܯA+UԚpUԪ'W f6p ~leanY$-GI5S;<C[E 9*Zu.pEjaWEJ%zzp%JM.ʰ6lX1tW2-VoJa͞nĔL>u[?+#`̐pMR07N{dRC~o|Apy.6hqNllDEوHړۨ߈YnD(z/0L t\of osKY(p:&e^QR0G,M* A8b%Gi ՂC{NVw'w-#3_"ߌ1Ak  E-κy,>k&6ͻ]rżHzPeaKtx]^Wuux]^W ҆T^Wuux]^Wu}AH_awާ&0PJ=uˁ-Xy6(sR&wTaƹ`:+%7"\rVrs;P͝hn 74<0Km*CA*U<$qxB~.d{g i2ZxV`-eHj<6zgxbr7n+#]3աכF.4̫#`uY .AEmđ{wlѸ8vXG a,3J &u*sIJJqSJݥN}S #cDmn8t{Sq]|p݄ƟAn?98qB%Hu(8bk6|z$,֑HYqȚ#r9͕en>?^Of6=6/V}[a0,^E4 P"EK嵓rC,Z3r;pxfN1t s@ߊ2 Yb#-{o8{.P5|!#Wi _T4tstM@-[ #+`%JbC,I5TȞF8@CQ-eubZR_|&~}׫3Q>YPqK^Nr8EZ3⓫{H.D>!=$\72̤Biw"Dg3(Erw_ 6dp-M.H;hG]1[lv\1m^1AIX ﭽZ~}FO4[%l^@a3zSwuNm$}$uJFɼ'+䕼 Ǘz^`s/]C+Y^W?{+|菡yҙ3's㰬aX|Rzu^MǽW @eul\84Ҡa'l [[9*o8>f< WM9}=>0M7L|P.YlO,W=7{\ms'16o LRn?(}aV<ΠNx"O}BWx?H^=jq2ppL+r9[$ѥ9CRB0)c]{;J/qZ}m7Vv_іywLsŀ}y;h:Z<%.:t'B! .6m@t!=XZ3Gii%HPE ?ǍtQHN3v ĉhiGi!A.xF]HLY Rn> c]^GCJ9 ;nʽ3ø{K ÌƖM&*crwD\]LYX٧)D*4޴؆l|Fjb w kJĨ6xΨjgokgGAZ( T+ ΅y"kqfUJLo߲HׁqJQC%sFR]F Ɇ@ܕ&ww==]vߙ؍\ "8) R8c`eH JpLSc5` uNM)0q01QFYJyH@\KX,X8z8DC3?NG,{F!V]ӭE5y)B,81cƐ(\ Jg) %Nh&U[9Xa%^ ڨY5MDI&ntsfi%1  AyXGAC13TڬP9NZXa2'IdJQ 9,T"xi΂0HmBD 5uQu$쟱:niRX7( ވ4dae-HGSC4PB;)p9X[:AjWvsb>0H9E!P1 4v0BZ9m Gӆ5c 䛳nЙ˴'u >S^Vx~"9+~یoc(}[H2 _1P. 
xPcz|Ʃ<; !!-rzN?$ Swjpe/oNPWRo%P0 *:m K[O sr [ߝ83JL2$sϡIH&k'nkHn_=A-2,HTkm \A?Yy4yGߦN\F!c|_cb.np7eg_c!g+^,jYw(nv|Ta}@L͟ъ:`lf7e;sc]MYmplm!V͹R Un#m:%\=ŀ-}]Be61x>)xkfe R*ML‘[b]n߾_~r/CӨE/}Nт_* ߎFgCnQSzwde!],@jl/5Zͬr_[=59>lC`lLk8]6o =4kf+Ph@J`Io~eY@AyUHyRՐТ; eZx|b 2@e6{d(=&ʑxR*u)7X Y X[D Yxo?aa2bU3H(ԧ |-UyOf_?N`|$8P>J +S8/}7ed*P4ћH@jJWt%EՠꫢF2\lQ:%uJay0jm KXB7;h +wySu7Z/ʃ$;H2[I%|!5-yThwUʣwM1r#;&G_~h6],:G5q񎞧ķj(+ŹJ-| 7w DJ06ZրLA D[rM[~w5#O͈lOELIJ6(c֫>WxUި`Rr)i2mJ>b\Yx+jTH+4pzh髕gggE9CSqL #wn~dbt ci|Goa2I~}?-4ZLo4V=oR糫,'"]{E_ɩ퇙\L-Do&bR**@Hf}izc糬]c8?cY[=Hl,$I-vyڗ= ޥ"V[/<-굘#6*|7.TvdmhqiPr5ӻI-1b۟>8nRW)Qϋ],e9n]o*c֗dVi'mɎqCΠۯtm;ǏtO:_OW%;Gm Ko G:vMmwU?" ɘ xkYԺͲ#tl/H-j٬z=x\_G\yPvu_o}{> ֗>rƗoQg>v5׷nzlڎw>~5xte8a|)Cv"m..ɧi'NyTjQ!dl,3Ⱥqqrc1V\99kR}[V?'R?R<К [- Q ڊup3rJ{5;w@v=:?pgʨ"HX.%*yE6kyMvSL$ UjfE{id(3 }*CO$"pa*j<>PӇY-MoNH_ lTg66{ztUQTI`R[ {@30!g"@M1{h$1K.zRl=1gp&E͵t+p@62V3qdUaaq"nRn4H,YY|.̸#SyX ws3r3/~pĎi&W<$Ue!HYJHk5*1%Yغbj1gE'Xؔ -zlsHԵyt\̾vq*jʨA6Ċ&3c$Vo,%"iLH3y2@/aVڸQq !L&MZ$H<5y1H"&j'fx:{"b@LFUMR1t6cVH}̧`dDFbARAuWY$8)fmh>MIs#@#!B"|]䳚9GN=|\SZ/9ee\.xP_2Yi=rA( gYZHQ(gQP,ρ}jTng8 \1h!2j":2S6m RS YVa4ӘA*RB([ gI(k#hȍNc5s> HHQ<(HɣdИ n.xJXA"fǼZDz٤/f ] utcQ1sʠD,]$/^S c_1"CE♭HDȍV=:e$kk@YJ, SGI-98ڕ3I'YBjj򈲤~X,`WcWYQ.U‚AC8V*`.%+*Z*J-*ʥ(yV/0@U `>F}33 d16ؼ,jQ;؟ܢU>,K]"<m猗NރME%Nݠ()y}.g@pԁI\J#@'cH(<;_Wߦ<"8 ep:+1="WcĈT0g9O }IMBH3`=h%x Q˂8e-P4 s D}>B:0ü ˪;VS :{z| &X9S%CwC.xSp ' + ~{,# /PPaKVubub.KYG koB p]\!  b}b%^!\YW~v1pU5îZdX)W? 
\#_z0'A0\=:쁫Y/ĮdP='W_=pd @E* ~)pU;\+p TlU1|\ȱpUzpEL6L?.Zji|tf-`k𛢪iU4w,η +~`̐pMR07N{ِ+QpWCW-}ݛgܬ|+rvK۩ךh,hP0Z yBgT"[-YL3 S￿OlG%d|[)}TԈMQǷ{"~`O}/G?rZ{Ls+5ؖx_/˶;<>t֑\=_ >+oY"}Lqh D ׉^Jg &W(cQ}-OH\ї4-]Q}!S{;M9:[]׫__''/_R=R{y-<^KYT;j箛[F\/7n%\i25CFDQ(aEZ`b"-\i!k>RC`UFZ3$GLQgLo[a,ZenPy\5G rRBI7mz-y}_?/:Xhʪ@j5bG <~_R37!T!ô*zÃ{o}wقǠc } LN7fx~vtO*gzWMwWF7]&642ɟ??ܖןTNdtsA"~x\>n]wBh%'3zY{8S`6OYYl닸-ngՆ ~g>m9"G~IM9_^W+S8S||k +yhK3QnG^yAu,g'4Ұ{.{m.5U҆+Gad:Ǘͧi%B71l;8uy_Nµo~'iSaX4 S:0 蒻F\hwh6,p32͸Hu$"W- Դ޳(-AmzLDUI MUfGN^ʂba,5Zjyue;\ "~k]@o2-`dsqB`*#& {5AH<%*EB Zm ȱhPߑ==hf{Yot],+36t+˜K/'9jwXX,Q` ګxNq}JօL{i $3˯1*(rc#BX!^[i hl&r ajg+7"jvf C=[iIu *TAC1Η"uk-Vh=;t2a.ȎӡѨ i}ohX| HhM!ZЂ.3FuUyTyYP_֮'-#br`DoT8TDIb^N) gBH M;۶? 8&F7:Vd6Vh9_UIqb~x9."]&+7*U@Ħ+I'T%] g ,Qi#(M1AdF͝ 5Z 2*GT KIwT!yg#g^>Fآ?{;Zb٢ST,_HR]AYN=:fД.@**Tk؁TH[l/ ׊H܋#BОK<1GBRR&uټ`:=V)(;R;(kP bfT5a^Иf#g\>p=H&7S`0ŏ7+>e–*φ`,K_ ]0RZ^DWTD\_K]D^P>Tm^|e+Vn^a 7./W`Yy7$DV<īֺU κwS*cUL@yM>jb w gJĨ6xΨjg%+g#Ûx2`vu,./dqRr]}#,,j% Nb *0":m l9%9r?̽;,G_fD~<բW0*Œ 31 V[`:45Vc-юQ $o3$\ZM؄ A2 )eT֚k ǛTȳzQgI##YBN%}PFVݎ][i.XVEOmẕ,hnЙu3'"R #,`f,҂KA) &©x ͤ yݖ=G1 , ʱ=_eDF"Ű4N+ l (O;@2=_QX1T[ P/Ywa2'IdJQ9,T"xhfڄȉAjEؿ۟h)ӏ 7"8M"{8YhJ0 k0-FʻI)18}` rSBʭc: h a;9SwO*+oi&I|Ƹgu#g&upٷVSr=a^ vi_?VI~ ƎRO(RkܳhPSfl>ef'X" 2ENnedSO 0E(\ քTKĭ'W3r=[]4pR|T)T垒8{7"Uqj1l׼UPiºNN`i&Ӎ:r6q| a|1QinҜֽy2DKɇw!15smqпז.GzIeu~7[A!6­#1~aH0acHuJ>~ Ӆ.c XJQlmKgɨ{$ ́}<u*8$6N jxЩ"rCxugaqoc㻋뇋w?~D]?P8i#'˓X'y#ֹCSvZn|q%9qlJlJّ[m J{?~}7.ݻfn'|aNuw:YttM6سt yKϽ9BqgeɁg v^mKPN׾[܍}9"g4`[($TS߂澋Cl>]"=GWBp G X96H- kd\iay %Leo9cip9k0o|܎%?_B4 _$GKxTR#Jע%"S핍A VwUNlTl*TKM&Wsb/إ5o^HB*v0aֵ]de_&9b2pɋ" YG'났CERLHoDOQ5fc–luVYD*i;PJK.X0PU&._D2z߆E |wՐg˰bhlR#K`<=ctܶv j=* Z V\(O&`֞xر=-wF|˜EKLG eZ30{-GMFS^f#'C"#IA -28MBH@;Adho%XacknwXc/}ym"8G7}8t,  2@OTAUxXCzrl|$4 ^|F?{2ei5_Lx,}RTO]C{U]H0 '\I]ÇhY~YWo ]}/>KQџRQ?r$A(RaO$@6:'ӔH8 diIPy÷LXbᩊ}'*%fz/8oQ9-^KoZݗpVlP?Y;v[lg"7VuUaqcZ'm؇.&NZкj]ԭ7w9,t:k%fn^ͪChY@ˢݻݴz7ySjZnTK駛s& σ@-\_q}0+ސbƚOc6zywKgmamS;6Xt1BzE#`Ы) *^KH(KGTtwX:s-ġ/ *<[OYϜO2p"&?}䙅8-:O9yW^/J1A8%_Rp8KaŅ9JOT 
XvRd(@ӫUF*l ڛL9jd\&ȸ,eL,\ v'bWNfQ݇Oiws|ي#CKpD\EbD`Ĉk iI|6>i!l#6j5xLpTZx01*x."ň菈xD\xތ"rB`24h=ҫ@XPZ9X b9P0Be?&ΔV̽BՊHڳ $-Ir?w,u.^\dYUe6v@6vz۶ﺎO<`3xt>vŏ[,dWZ#)2ٙkm'0|hqp| ;nO&VO;22%dHpZ2{e&A:qkc-h/{= Cd AESi4C|q!Q.k/wǂ@&:RBQox9'Rl|5kf5VmeZ\T9ԭoMtр}L^juʩbk^[T5\]erinGcr{ yQاTQWwrX 6CPֳV%JRz`4}ov!{=:U %knO辑 1ܗ{h{1e6D_QWki0M|  P86@ ++uɹWSZC)#\}pQ~>n(hx?Z:s}:f/UHcMS l0 :mNǾqgYy}B۟1|:꫄$3N9ۯώ?m#6d >3Mb[Q"i)G@ok1}n4'7+u_ | xխ=4G{?cZ607>c?=Ի97,=Ւzlj)6:#5_HJb t*AJbR\}}+8@?}$EՏ^];}o7r?2U iFh70?HwL-&,׳ohe^\KE SHrP|Qs)b#ڑmM6E0H~jR[c5(wp}tVR|K>WH Eb̙1!~SZ5ȼvE{ZH)*r=7T[4FDGtDN;D T)vܝ)&Z[.=&Bc2#VIP} \]4jى6 DxD ̉=6LM!F `#Y9>@|v8=2T@\y*fv,4*05 ͬ1Kh_w4 @/Yw\g|Kz{kpr%_cHGRu*L%GyJ'5wtUUR`-܇H9Jjv#$sC 7Uu~l%Dq`vs=[)z5[í{8L ,?d,Ȩ.oIZF)vžH(,5]lT1pҝɬ$Jsb-*"'m @]0Kg =wB>uxVho' Lg.+e`4ZAR"K%V Z쮖EbRn H<)XzwYDw0c)ۦD`H+ 4,:$0]?oV/[K 6 i8K\Cʝ4")(Y-D3D-lį0ȵl34H_@`7؛U\6ETUP^: z+lj{))ř>k 5Zݩg.:p6d`"ܕYGfir68h`+{LOeŁX͊+> 69pw3 vt)9?`~Ah{NPێhʈze(4h>?FcK ᨜$K4XXfāa!(}@XKOXR\cE˜l: GlMh5~!]:>9Mv'@"'J5[sW cSXPKJf*%d} 47Pl tRUCy9ۻ1~2#g ` Q1-C1{ړBmb3{hEή7UǨcǴYkD~Ԍ<\ n:˘Ո MKiF٧F6p(^$#ĖSi戇*xyF`CzDkXjr5s?#"_7#! |&7m,w cTH/ V )8zydV)tuOuթ:2A(yT T"^Z`O r5=5jz kX. Ӌ _D ۃrqr ol-Za w((A&Ga0⓪^uU .c  "eѳd0%n;b^,DI<(DX%BGPՔq\,Y#9~נ\U8/pQn5}Q¨}`5D`ae(uaA8 fY -fQz`+}!F`F3Ƈ(^cO&3ҁ²t`J8ÛṰ0x  Z6l!& *V.:"]v)&Y0$͒&H"MXoS"[c`@u XgL0j]za"/:YdAP0٨*S&PN9kO]{j@95cT<֠[,YKc,{[g~E@}bIc1%!.~"&Sdₘ@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL2^zOL R0@Zg^)2S@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 t@ k?L ׸aaw +.@O 07@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DLr#& 0P\ ZL yb=E&0zb@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1jjz+^{5TSu^_6 W/@S)yȻx$0qIxC\qH@J#K^8G Z썹BqsYnLj͕7Lj'ţll$$-c$|̕2d1=2W e!2W(b@Zv\BzJ(->+Xr7 {cPZ-w\\1W(.wbPZv\ʓzJiΝ#s e\BiusRzE +n s:/ 5ﺹB)!s͕¯ܶ7KB`-&C.UA(FZ\pJ{jGL 9o0~Srbs- m]v\&-spNNw߭C&-ny?gUړ2zXx *?Tdq̸F (Eh gߊ_ڷMr/'/PϩYW_0~.[.[6E:.ll5coWR)ҙ6tٍgF|5+/Z/\]Na39:<23Flr9|EU,X#rYST*e&敜miZZ>8,B/f…8S9<}̇㖏2 7Wo>]+/zփR:Mfh.@L/t6 ߗ7//!cH.n_̨*bI7AD%&ZwX,e. 
Mpor#cГ1xbzt~ hA?;nӽ7'!XΈUKPN&sIE6(SJ{rNS%m:eFJLr=sĶEq%KS>j\*h u+Su!9> Ų*+~#~Mc 1$7#sNB\vnB.`Ѕ睤܄ t5A泥?^I險 Gcu,QMs |Ic4&͋nUS6kxzw>PdM_$p{]muV3)򚸂"G,z,[Sܜ)k+t4)4{!mfQ !$eh7 f;,Ŀ(MJ!d!Bjk!FB{!dIؤJSYN'4\2ֺX!\g@)N#M.\!nBg/\>p̓A8Dv_ƺƙ`l0v%:aE1W b3NFpys:LEQߞeq &Hm ljw2UZ[>p> n(Qγxp6nr: oT\Z}uz{9GNofo?ХM*Å;gԥ`Cw1Je*{aՋ(t^hYJ&O>x%suq ze[0{!kBES=*߇J/BV] r vCx|KU.4K[N8p$XRM75e_Bt<&<{rW6)KX_O:V NXBԱVJQ1k/d; soXI!u_vB[C)\0pК6cpxLCmxŸz^gYכ8\XY_Pajҕnv WrrxGcP;=x-)gtVM2\2U0Q܌&pa !E2^YW,1c2} :[½2. qr.MVؓ6cs4–``Y2%Iu `5lV:@q`$_Wꄽta OA͏go˟<.ܛ͋$p ]"H"El̊ p<MΛNݢiroѮ}v+G@Pj fir $g߇[>=r^sm[p| Ći~@hTRhܚ'^leGAS]X?zMdŵE4q,tV !P b Y)E2$5I'ZZk &igZpXUz{:2{?(_qTgu7>BQR5ޮs/9Ab<9eOgVg_}w]gM'.3[2U+y t✅ﻳrgNcXEYehavfˌaׅx9sXEi˹5Zo f/X(+kAAC *R3CU,wњ }wOڇ"<$D<1o1 ʈ*Klz=#jBtK ݆V+_2Xl[s2>t#a[+߽^ݼO>;5[*oKs4[e V톃6R+W-!AZ9`Hku\t6k@kLR>beJzVw̾Y+AWQ/E=뿖7Xv0"kR)m蘓B ]#(/N|Ԛ )uH҇Y:jQJ';R1Z_Xc}'ڛ87]BTԛoIKvfPz x%N_Ѕ];A>}պE<-r=fOM!0NSMhb X9UL,҉.t(( lBRCHR}es]isH+ ðȺ ;f6G2!()DJp2 B嫗<1♡0bArgux4pgEX "tϤ,hferVW9Oci[˱HfܸݰtU-_@s̅}rryzWPOH.bN޶L\q|?ä; r>w^4IQ|Ս5t`A+з7}[W;'@<$.]5]t<\޶]_A׳Jd|I_q'!{D `TpM(D^yA6D^=I&XU8llV %)ĈF Rv{1q6QڟO~_ e^]N2 R#SrC]JBDprriٽ^65eҨ qN3hJrn>rL%N&)ͺH6s3T3]Pgكg3V "}$V(<K)U1WZ'vL2v#e~ vP/  Cu0#1 5\mrq "Y AJ.څhp`2Qf$P1@#م!S *AL PPG9K00g-4\w#~hyH/jdd /%U;:%e, Œ]TL3@peJlw=7?x.0sl~IZ`8rhtN~佌d~$4Q M4]?=`8m ӆE5\o\7Be';Byp.k4&>f]uf]%f^팯q;( 'g\&Yx\[>4?R{_ klތLm`췥%?Cyw׾mMR/x|_(G[!++4W:0}.bx];3cU.γ78{PQJY "B#+&&)ڲо^_І{Q}&vKo>*y ly¡⡿o6D\cTI|siw?aҶyvhaR)P)E#NlUSԥg Ε {uܚ^wvh>EDFmx. 
.ڊ۶x/ _ 6 b 3֫:|h5~lΏ3ofjvv:+wQ݂1^`v?4α77 ]yl\'׎œdБ^7bh8 BW(!P*KÕ2ճ+G*)P^K'.觅I+{WpzSK^\QřW(.Hs(pJ%PJMYW/RT+1y0pîB;\e)m慎G+8 ʹ8Jz(p5j᪩/ zzp3yH{W(0'weP;޶$8v/oL> nٹ/ERwl#ٙ߯e&b$]f&6.18]]:/+?O>]P btmgFhc]³va0F =M~/&BkY9UT)8/B`>Zʻqz;6'->lT%e/n.)HwreūnnܮKTe89N8%'!^ɘ#r] 3Y}ȗx̋׳+UβY4ϴxtΏq42+Z{]ٺ*B+M C S0r0xPU3qcs_)ۈ 3͊jTZEp{U&((Akux*_-Λhsۮ fF3 Wu= zhmR1oJY4T"33 kjP+{cr+?>l[V,0zV@lz^Ћ弮!R4L-\DϙR7cZAjk AjEWD؏QU|ԕ tơ"#+GWDٹf]GW>x#]3"\ftE!U|H " >]MB`G"v XWq8Z *AZYW6F%0v7.BZ-e"g]=ݷ{>+ BquEF+fPKRltE R|38F]8 "FWkhO]WH Be]PWFz%#]!0H>"\#hM]WHiD VK@:N;@_nfiUUY#ԴSAHHW<E\9tƃ8Z'RQu5B]yMmHWӦQRsB͈2M.DWA[<#]!W]թ(}~|tz6} SXU n|tGړ7$<6D*d]tEPh>"\FWDQzȺ]yXZFW iP(ͺ48#]!|FW+*Aej DW ltEJqѦ2d]PWƀ1?wlq6vy}졜j~<9*njrk/Cin pI-X#+LA%Yv{$#&6Ax&PTvSw3+!'Ju]!%dϦSOH`ylC/Åa䍤5 Qt%#t%mz)xpцuAڬJI)g+°JEWDQȺKFB`:]$Q\tE.y]eȺ &K Te  ^,?^^,gq~iGoNW5NgsYE(M]bJd/~i tu[:k=}X]_.VH,}nϭ)T3<ڙ"@EqUԳN{Ue6C{SLZfP #e]2oi>m+<چ̑ yr%:+ʪBoˇ{? ?e=Y ޳w|W[l=Z)HDR[^Ŏ︿vl_i>߹g3vz%a_W= glןQF淋߯'ة7ky5كtQ.U~=[X)"_6y5@X[n?ב؆ܑ>cL%v+JYM욺 'wcܩ1m (n L.^QR5+ fH 7~lj7Y?4?^m=a?ꮛz`IR{e{Ej}Dg^ +r.rZ-VyXc{-ׇ@ǨFCڌ~zesU>tv=楶R^7AgϢxCܚTnͼVRQ\u:Nc. =ymcƚ}+r[W++Ό9_ 0Gw̧)zNw_o=z'ow/2}⁩cmwS`M_okYnkO׷uuhoIuNkSN'}\ 6--9~LαLN'j?|ɀR8B`/5P;qV(kF(b+^œp]"Jƨ+Eg۱&mɪg @XY#[IOplB,W7%㔫OZ'j\Hd(*\ RB߱ѧBb񫏗y1Yzvu:!=`1;"z:.U)Cdn4ٶr9CZt.h>Wms69feQk U3[WE^Ws2ˌ鬋Inca'"p 0uEM靛Gw9骄ۈ,a8TҭkZ@`fjp2@^>F1N-HN3NAsZL(QQk'%XiFWk6uZ*>H_W,  sfU:y]/>]M4-ZWQ /auCv(iH%$Еʺڷ%8@(`=x8\>BGO!sLu d<7(.BZL"J#F+mf+ +u2u]eȣ1 C`+vpg2Rv_Bκs]!]!jUev@VN;aJIV5n{Ҫ/n{]=A;?|X\Mw'+pNʫ}իݔ;.m[Z,7ʊn𱨖g{s[k./0 _^uB|ya|ZN O8(>sghO]WDiu坖;{FW$]!R(Ad]PWY $]\tE.G#Dlճѕz*ܓ[=8`UxZ/襩8J+:BW:jߦ^ x:EWH y]ejsWYWO+utK6" iJ^WD *jKptE qd+>u]%uE']ltEBeȺh? ^tH<5A|[.r6DieEũ$8Z'*zI'XmBr2wz/,iJ,TN`2Qѭ:ϟrt'XD[aR86S 2@QZF8L(+mphH]WDDu奒23Zt ]ש )YW#UFHW4+!u]Ϻz>MS!ẁF9P8JD mz[tE7gVp҂uEYW=iw:TWj5օuE-:κz]i\Pt6"ܡER >0s3H6pz҈1h' AVp v*m_ejOFW3Ia N,)ka=^[wѿ X >&\&hM]H gMP@:HW9; lnVkƨ+UtEz3uֈuE2?'p#IE]!ɯ7#JLϦ7Sa/bfqj`]Ђ*ҦDd]x#%#]WltE 2jRVIW*|tEC/u:u]YW#ԕ=#]!Ss,]e+"u]!eYWcX?z67`>u]YW#ԕqR&! 
H`+Y]O~ )]@՘t*.٘eL:Jd#Y6+Lm |M2RL#"72-Q-tN2TT+S,1f`T@s SX^7P:!0Z*>ejiBϙ+52PLmˬ'ѕJHWl5" ]!O93Qu5B]i崊ltγ]!ܭ )ͺ> ۇg+7ۙ؛1JϩhĢɛyf1rP~jjIw+ɿ^|yTW:!mCzQ-;aB/j=~ \5J@cނш_!.]G^io#={Dru墺*TM?ջR{: M}U:?opZͤZ9VV̭/d彣o^ ߁0vټ=[}lw5OZ,˺p URVB9ugY(VΌ, MPdm7Vuk5+g }EfNbʲZk2ʸr6UUk5x`-aDmj?Ŷ~WeS_,/4U]kƀ\j.`}[Sr.fͤ%`ʃ=&ki0Bi:geh[I #HF*,F\y- ffLٝX^!6%v|nLUϔnN[3IA0bI -^MXz;36\)+1q97 X7޵q+ٿ"z,>/  bM%H peV3rpfpHr;\WX;eXdeѥt,Fa?d{B> ]l9)x;VX~b#=bT:䑅!uqI%z"h͗l닳!bfmY` %SR(o?C4BnГUj75sn=8spMQur=钍:3:9voIkNֹYʱ%9fqu~η Yh3!9U0P %% HQR$dHb*8k]6P4,Z2H5cɨ|S!'c=wjbss@J"s V`fYt7(!Q!sӐ=ԆR He`4A 6:򉀔sK(<4Wn>wK-\-2* A<2'U.(͛հ^C,02#GjGU0(ʡUfAg<Mȁ c[5:Ze S5rs1"#" 2I4>╤ӜG1j((*57i,ӫXmˍ8(&jA ۖ*hWNc+6Awƥ~870c:Yo]B9 2,ڕ M=cXZ Zb\^kVЭ]]@1Aƛԁq@r ,}rd6Q t}1[ס*h ,e"ԿYn%&Y/^i6Z=Uƅ!W[u!h}6`>|7/4Nwy>[l.ӮNsP& q@0u0uglhfѳR@ڸ:8evGcgšb1j0͸֚cҦs8%ٮ dLB99˘jֳҌԳIUq %A99j=r#as쬛UssQz u %, NU DGC`݃z{uP}V!@cPA>mOwĊbPDtNvM̀5?);:YOeU ‘rW(1T9!FSMWus *Bz= T DŽ]h/o{qr޶nz U]X$u |]`CjAڕۖo P DXb9gK6gw7~_Fo>qRl쎖W 'G&mggV,<=%?O_ެd{'G5fXk^*' xG[j7VHýkHT:Цuss c}[gfMZ {r moЏ:}BpU ꐌ8ù$M`n%"''з,H@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 :<9$'P -+; [th'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qua纃N  8MBL@tJ8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@ tU`NGlNDžu=./ SNGEI}'ȋN "o $N Zߐ a:98'qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $No qkW[軟Gm5ͭնW0Xl:$d\BƸ£q Qj'ƥoJ,?# ݆78." 
r՝j$*\/#N=Q;W/.K* h}^h{VEzwq[ ^s@0IQ#Zz0=t$0-4bE?nX͋enn۫3<}ul-Z:~m[Cٮ󭂻:7_ ٺH*Q=HޞL˺Y?xuT){8 'o?ޗ!MWg~Kei7󫳳GZ şX?^^M03w˴ߧk"X>7}ͱ>->u]Gn͙;x~pv$/6;h|<Z]vGN7<=GcNO8PT~7)h稃)*PAQH\hEf%OձhVd3#2~d̆eЩypwtH'Gt_zjz/X[gvvfFфLNW߬k%ev!<.&oS0أ㢮Yb5q&͊6N!r5(`3`, *#n(5J&$qfĞ-]|.=[v<̨ޫٰ55t٤8ZIS2 ,=fV<·%:MP`ɛf"{G`ȥv71x8[8p#im+?ffD4e|O:@ =d-xx:A`AQ3uVD#3"lG4,u.uΖ%OE;3.ZE2ތ$J(@j4ȏYp#EUϠPi\͂_ ΖOCix +:S65r ~f.˺EDɊ=z9G|o{ E=@8ygw?Nܸx&ێmbW5/E(7P>hnM7 3jT35 ~5n}R;VǥVcڧXSUùk⸏e#v͵Y0+23yE.g>vE?HN#Xk}nH Jr֜UKrĮK|Kk$"|XHʖH`wv=H'PĻf]Ơ%2I&Ҧ +m1Drdh*Z֙V@vA^)FΎ1!H))&-00R B(ku 2 ]@7ivīU[8dJmdӢro7F5:ck#!Dz93a#pN8A:#j㎓W0PKr 4<0?Uy +'lN/`BgP<;Piea܁0=P (*_Lt%W!1VJU X95ܓ7TǐuРЕA9SBRlƆ$% &X=szF9R<;$/dO?s)X`4hiD9EcHy,|ـ5 2H1r ꚰo5$dBKD ޸6Z2#$qXtzf9>`Jb`t\Ap<eil$U0' N~+0+󓉈,AlNԊED.O^ne3_n{w1g@Bk_J1,0േ찋4DJ]Ts_ gUe]#ֺB xw.G;Au}Mϕp_|/Z4|ڍևn˷*s+=XXVkqUw0z XzRLU 0Z4MP4aϠb7o"R{_wŻq wo!cu=B;_Fil%[[m:`E[cne##!-kbCZmj;lV9o"FcyYbE-1 wU$X C'@{>{ē|c(hHHwgM HZ-6{C geF{l2Bŧ ,u),u0BF̼?JYS3/Ne:iW3ؙWy맫]n3X;? f_ (mE@z%C@M$Lqzy$ >ͬD"K)`-O2\*ڬ:{xFɥJp}:{R^ +CV:K<9BmdsDؘ(Z/cr+"xˆ]jŇ ! 2鉣YrvNw24) FO]7C?Րe'\lLY˞jc@!\$ӖaKEM?7ˁCPVmm)&7=]LdӧuE\iwn;_#,tp]i$Qf"=oUJǗ}—[r+*~Hl[GWW͚S x[`t8Z<=,xe;o^zBYȶ:nZ3oi̷Fnd|͉geV{6oxE'{ 9'˭>~;tܖ˜g=ݼo=mGqAYz)3. mwIa2 @k/ݢ>wn K1 dև' 04=\Kq}ߞD-ԶaOw8́/j@eS_Ymv!gi}ڹ|zCux=|=\-:m}u]^pOE^.ַVQjڙ:u`1ysǺz{EW tfښtoKG/,}C,@)=\&dZ8aǏ>^Q]T2'aU"W"[y-@,ØU?X/>u#ZZ!:2DE,M(bΙJy4]<g^B5Mu;9܉ew}A&^\?,mޑҜa.\6Xz*LBŻS`Df .c̭ו mwˢ'-oxw*ZON1' *j-X'+NhlL|&J n4"eN:Ԗ 2eV1'Xr'j"Q2dc{2Օ!R h]PѩO%JUR6MKY3Rb9&rrJs vX1K3`թTL-'Nbͼ3 .ځW֠:e(7PIETPIy1/ L:f ɸ+! K՟W7T4X'?Y|>A48LFH\T*-J1;6Z=#),mueBpdr#^GmfJVy2_u_;] ߧ0@>{}qĶ*B._nZݘO72L\0$T =ˊ=: USF"񜵰آƯڡ/ʮbUvБ71ᝐUTȳ0Rh&\ +^^QE`8&t/"p/7۸ `8`QQi>Qr?ÅʤԲJIގZ_12Ti"Uh{9b.GQ޸cerKʽ\6|9`˟Ζoia<Ӱ?Vp^;擫+F5x`>n]wBjaS.[.1fmfͰx3Nrcn;P3Nb8?N~EX/~|Pײp)~y[q[%? %2ssuwۡzFuaE6H9>}\'/"dPk u&x|U.+j@QOZ7nhBK}Ht0(%`w1MS7_?|Bq. 
wS'X]2hGݘCr;p/?iÛF&ж(XzBcd91o3#"b44Д;hx)Ve#%<f_OMrË͊gl?.rPeXZīb\D:^0&cǥ|vbBE:&+pi` ~H)h< yM'5F),#p|QT-b{cZ,SJbi+G .Rf4I+3q+VW#7N A2P:_Ȗ "¼ +MTP*8C-}liuN)gK}\d-IKHd3BSTQ '-T@H0\C *קn|^;D`Ĥ$"7N(9Ǡ naZRڰeD F W)>Xugսf~C{FlS7-z$ەzIs)۵@XZDP%U T"dZ9mC>1K,re0eI9Sh%lR9De4Qaq6=T!. fQeD.4,=3xe9jPx>bhI1i1rv4,8CJ`5O"iGX{|ʼ;Kۧn;^˝5(唻؁RNDfG M*Q8 m&ǰ@ou`iSmIv[0€(Zu we. 82X)#$%rjUx= Ai`?|Ifͬlo qd`§$A_o8<0.eɌMMg_ǡϥv'ZAopS)XAe8AfK:p L'7@-'>tKHeiLd.pD)KL{ ,c@5CMlhLAkʅlewޱs3 솞-5;t%_3qH_)ŚTGٻ2uJm>7x4[{M6+Sa[Ueի̌' ofV~"䛶X|4/vIN5Ё;A,͢C xYPUͻ |^TzJC~OеdR=~^]ϧ[,[[і@gUWc%Kj*[Ok\-(<ָ:ng鲤)Au?qUڗce?MxW?76KfOݩZ0 B#iӧht>ֻv '^< xŝ}Z~X|}ϋwo|}sz³SCXÜ]to^^[vs_psWB~Fm/۶-ηmFm݌~f6O-Ϧ~/]|~@_n3nݪou۶m_t[vu >1y2+XOTEoGIsŏ],.wt g<ӿӯ}q*?}g/Mwnk.5 A?qU||CP6=iv9.o'j>=׷e__4ߜ]tjjă5Ӕ[~BY}e,@)Sz-ѰҀG>ozx@\==>t7h@a$aFA9LR9ieK ;J*CX (8ku]WejkWF s +Mψ:BXqSAY>H!nÕ+J4ںH=R465-TZ>?93HX$_^,Va; \2btN綞_ `~ ',FGHN <t6(]rQ7?Û*Fj\fRiUNh<;.V:B'(oQ֚)]+Nz;C}պK')[@u QwH)L=r#y\cw)k9N->Ǟ?ri7K4*:L<)ϮjajWRjl|յvoeuwl~ .}˽˷LyZ/v|6{a[sls_ 3 rOVj_ihy_:|_SbY{I4EkE/?].{Q>ۈ;w^7.>ߏeNt=R GH!ۚWZh-6yhT-'O+ j뜯:J54 Eu ~~]P')kjXUJJ0ME-h;pz0.Sو6X؍OK_yc&cռws;s:LꜶOB6WmZHvݫCrYP 9'^םx9ҙ>Ugׯ>vX^?)uɾ>chZ/x>5}nrO_GmM/?g&siE#w q7J-ڲ TCU)SK%h eW]ۦº!ϛ$93s,f\@SaRԸ:!Mfn:HenJ!b[ǔ%ܺ/bn۱# `j`]ᚁuF0 D\\C^q5]2b\TtŴNŮ+6jRыtEB֙uŔF+mU)銁NFW2]2 F&+z0Z(4YW#ԕ1(TJb`NzVATe]QWN[\]Դ꛽*/jzJ]^Љ(1s, ,/ڭkpҾ_߶!}K1xSt5jW';ːFi׻BB?𪽞ѠvOvmQ6ZHګJu!ق7gߖ#Wr(B^FS~ӵM)'nIe*tʨUu"Oec,Xar}m tww9tY6{5'ld9ю7uʂNLՄkOl Yv\kY4.6gd]Ut[ih& LjUiZTP}2g}ľ(N4|]d?)6?=e`˖wP5/[_БةvKl)ꚮ4zcPCO1Կܝ϶I~FILgw-7}N]W|i__O?sV/\7'g֨r'T5)O;c%'ݟy&oO]6SJ7jl Vߜ}Yܝ'˪8~*UP33z&tor~(/Zf3rd!L^:TuDzdzE2HAޫE!cr Li[p=$3Y/\"Jer'u)(2!]brԕw*gD]WD "O8Nj) U6]9]O"0`*WJ5h@ (!ل>@W>M/6wN%+J+5.v]1˺t'+F+)ɺ4 'DBb`zaSjj.,KWTtŴ)Ϻ z)% &]1KiQǮ+4"GWcԕncn4p3-f4rzhr嘐 XJW'i5?eJƨizQ;E'_2 wac4QjiGiG+6">;ƕ `ZcSB1F]yR4UBI0B2Ӻ`TQttrZgTtŴ>)1}9;6 J`CU? 
jdqqJJg]=2zRAtŸƥ+bSͺRkwaǮ+"j֧ pA+u2v]1%f]QWव.!]S.]1.$wŴ)=f]PWiE2KƬhTjXucI~g6%* hs}x4'ssS.*a9rQ!Ql:筌kI֋SlĜb1ŶFfJjBLPuj~;o"(D:ϙW'i eJjyNHW!AnjkôbQ̃Ǩ+Ô3׊TtŴ>)st5J]!͇~늀dtŸƧ+6v]yė+رa*\z0ܡHѺtQȒAd]=%:p2!].]1n:bڡ# Du5B]i! 銀aaRӢ]WDib֙uu]0 LBb`i]WLr28F]Y;NdtŴ(YW#ԕ\0+׹TtŴ((uEGӉxG Y_6Kf"@ 2=`ꋭ,)d~H,ue6A&#ua|BT)w[;1Y`9b0h ,#C%O1%Aʈԕ!Xhԕ&P4PKePIRWoP])B8Ĕof Ƣ4`)+=k_WW~.>#WpP;桮XRW;/)!Ѩ+C.e+C-Օޠ"cE cQWZ2T63gQW1+M8ueȥ4ue$tue87JFb]F]ir9ƺ2R2TzdLLY,ʐKD,PiP)!7vɋBU*>Cʋ/J[X4L-BwD6!!/KC2tӟ!s7&8S\I=O c-x\b˃o`8-^OZ&9x(`e"ЂRZxRqhԕ&hԕTP)RZ[TWik82]q=TNk`@-gr:7kZ/+7G٭llyZNE]ƍyn\G9{>f9&^ɟzEAe'um1 Vbzaw&_0Jn-NVP<~?Yiy"Q17{^SMaSZ߫_F*]y]0wP0}L}U_jxŻ~UX/mc=f[ainsJcj. /4Ӌ\q,֚X֚S%c)9)+P,D bQWZ|E^> "^g}PW~ Q) suœz,("F]r_zXLz`ɷ؀=v:!9gY1_x[96FKn N9+gaS|x5#0pG'=|xu3Ks`c\#ӆ;Tfw6oevȑGe7ފwd%Wlu8zrI5 W'F7e^v?|ӻOg{ywV7YaϣWzܞ\f-ou=:'MȟzZ}5ʹ#s fɥ~Y^y_;y68߻׿AX9cSvz \;o {xpϧ孾{k~TԂrTuq*–*-5\ဈJnLX||Zh\]kb44TJM'%{,,%Lp' WqD%;@MVy8 tQZ|\WzP EYjWK9b}6/Њ !/ s䒢:nxÚ"ߺmsyά@a$?[tg č{%;g׳J4f񴨎宇%P;]K@}7 7~KIniw}3;֎`%WDrRkVrhÛ^; ׼]s 滼v`ʌkVr=AG  9Z3K9YM 0Ь.y=2ÙD{6; \-3}Kxn"_Za(s+Y~!UhgA=չ/g$hyL053°by렻uqOS)=@IƎT'64^ #\g@02YgO}MB]QUWI;>NHR㉑bpLR >Kё #6דBAv:> D <VCLo&y'[_ŤJpPh.D <51s#4G .)܈qO$<uzE+FoYU,or $Q 5FCrS7cL[_[ J8K`Bq^"0CQg Z.86y5 LHL0(Cb$R^{*$bxZ*{0*.i3sE#䏽" ɽG F|׼Q3NL +>w+}OL6ZҎiADw$N:O85{; {<?Z'{,O=棳GF~?X"۸r:/VU=Zi LG&D1:ǭ[vS8OBya׺-QwAГ}! Z%vsOSB%BhF24biͪB4v*b^2 |8gݕxUAh۳;YIZ^uWO͇ɟ7G4IŲe)oF%-*& ViɕMjmR+h!hSbˏ2_a԰j$n.q#,\THEI< YZ T1æ,Df'k^IL"Ed+$ByO ([X"s|82‹/|y~ItH9.a]vJpF{, 2`)ȱj| Nl|Qj-XWӛzewXxQ(E Rĵd`Z4QjR!LکM#pgYszY<ۯ~-/{8McRK>miizzNokcks#n}_ɪ"92 -Cyz;syڵr=CD.yw2 ٫dFX|;bdy.)LpF.,i:]YG41 o̪UgcD6C38A?>3 yvl*Ecq|$91RDBT^*q ]! 
+q1j#-A: .mVjQkCV[[9K@IHNkL)(DÂ1\B1U ɐ`QH#>tݩ>ۢ)*e| e)RQ4wBKq$*ݠTDzNl?1£q.מ.ctF@fNvFфy䔳TQ/t>DCA26(aŹ->ú^-Pf,E]Iq{\“~ O(m,c!5һ#k18`$8hѱDvŌJ$a1^pnBVVqr>tp^`4q'm<AF^PL1_J[ƀ@4&2Kb9$^xxɩdzeFȀ^XJ\d&1XDb Pǂ/!Roƀʊ_~*J1rKb$՞b›Y'hm 8&O'VeqL\m AvN"-%:l gȅV)')e$F' dme< _9`=/;؃j!TUGsy)3JKN A{8^"PjS% ɣB)15Fde9XRuه[JGd_(M0zM|=eXN{%a8VRH#YK+(k^6յ*ê:u,mYwCO4I'1X,޽c%勡ΏBt5'KM by-O(Ggc-(JJX4I&XShG]I eԔE^bɣJ5{%V㋻=O\̏ 6랗6ܑW_%؄r:M&L#˓Af{g_֐:D&%F{qT8-V`p8䌗(l*zQ;oك3&~ʭA[ȇ|:!K~q]9>Zi4{\jKRl4J _d :)ѕV-σr|X}x\_kUbx)V\?&#60׿olZ+fQ2`\["HT(ck㯮&ez@7 Vd Byܸl~KtIK+f**0*3I$&[b.ދXG tXV.v"Wp"v%NǷ!`y|s/B)Է^s#}ϋXI#E9 " ›wB|赞di$M1c=YyJ!}Cn8eSӬSoI'I^Hʷ6~kZoM2/AB#C+IOEYtѾH:zx^kAS/+g?ub+kS J`|T$~;Q3*D#i.t:C*S NБ n҅Н޹ͬs:%7:=9b5ʹW鯐`ŝGP TF[RTE'svr ?#FbxtZfu8NpmN݉h&R:1Wm"$t=AuS~\ЅF|4M`jV =$1h i^ɋcT=z똼DZe7.QyQWD ZN,l)A?c7ʵ  @Uh5{8\P;nKxklB?=8~|On0(V QFRĂY Z9ב\rG~ٿB-R!RdN hCH<:NE5HsznIK+LP1zq!(YMϞl C\ ݩ+NߤSCVfR˒#K8TA .+@I3uN5pՊZulCFgDV#ϽtZ\=ãHGe\ #1 ᬓMYg֝PJBk4_>yUrg9K"a&Q-s6)ZT[ I4+fd.w Z4cg[c}atCƂcYwEWGTIJ!y%IAE0 Fi+7 =M*wycix>Cl+V S a\Z1gY(щ0-9-A1 ^K ZzOP= 1*d+ wj4e't*xʘ mF)4PTRpZgX!#=D4`qY ,x!DFr%Ľ4=jH4[ 9$qˠм?R? 8pDͣbKޝ}_#N _9Nsv&,Hph1Y"c|X[5/_hCh"*-j-k\R"nk@6ڇlxDPvR 0qkDؼϊ12*8ё{3Cq#UܡC(bu px%St~ (Im0G4sB8sѻPFvv-++4Up[Z+96%9aPz1=y!x,;dhVhgV)0b0v˃q7?OJ~p|#+4_}TƘvvq*4aJDe'#P {1Ì$m :(#kۅ; iUj ȅDvڈΦڞv;賚C4&|1iUVdu󵕓н߃44 5N{ DVbSШƋujOH84QvaK[Ы$B]0U0y`O %8YnuS 4a7q ;ay%n``;peq3xLuƍL%>H9H8 oCb-Ty2\Xa&Y(kF="&ƶ`*chC>0 )H{P}^S;惯%%DE,F 5ւKaE|ar |%ЇSI.Oj+2ϜLXVr'޽.J/I7VFk)@pOGʏfX`G~GLTK^K&+ h}9 1yúFl)OJ>ef%P ̖\7-k\Yqedp=,jJB *$Vs2%뇘@7tE.(۬-|g pB,^ACmXʡj\ڍ&3W72vO[ָ 4"2JwqQ#I((NU<"9|#r1Ȏ#?W; "}wEӲ左~ gLfy; iQ4 yVfRK+[/1R54SN7䡒7e{x:~xx^.՟^ɆD]k0܊΍Lr8`hVu`REbpR V@dy<k(*YVTqVRLʚ-kx]ˎ+*"k  'p[ָNp:Y @-99 2 OʈF2nU4L96QxjwXP$-\$4K6 %pDq$fߟy!2MA,_ʣnw#aC8aq4Hw0譂6Wϱ/nؗt OgK'log 4uZָpLkr:\ttZָ(01 nEݍgjU'.tX🛢p[\ Qڤ ܻ;Ŭ8:b:|j/Dzkx5.DH'kPoM{6?RV4Y뢌FwڸANq!(Ek. 
LU&O1,dUτi.\V.:wE}.Ip5LY@ ºXEB#ѿn>Z J,tUwa]6vrjq_[5^mN/#p5\*j/t*Q) W~.4yۈm _;44plmZhw I.o)"_yP2B wA(AmwǷ,MH*_RAh+6 UW47Yoq(H*sܷ˩e4y+_K-M2_{$8v䉨fPŃ!٤!Yc8y%B>z.u2VԎQCxC2}eclih.Sh2lQX?d1_"@WQY%P;E-fLxAr9ɕa[mjO*2 Eqh{ߒ ~L߬ ǧֲZ tR$7/ t N50:ct)ыENgF!'D=#M`z,lLRMLwbT$c5Nh zɸj~@?oO 'lLR6I?Ksb)5Mȇm w-אikaO7# )Ji,RJ.O蕱&q4`VY%t^SYw'&kI浢(No خxUq]fӦwE EfDnE5L{8~-d|3t\˞\f|d脑kh0Hn0UnD5aWU8{z-.: 4ծ&3UOmA| M 8}?P{퍌bG)vJ-c\;W3(Z+EZ$x rϡ .5Zs7l6Oq WIj=n,7:DSitӣujEapFshMB?{۶?݅!l׍6X7 M@3֍,$5J)q(QH`9sy̌ (ܱ֬kJ-_y /UռzBFbA +fU+)~\Wk~=ʩRIm>pÍAF+V@_ eh<4SMgh.=)ئFft*mnFԧĐM)\{>Mk [P_zu-/FW5Un]G!. .S%1\ R1<](Viwr?ΰMpV䦂aTiTW5Q8w-wwUխC52K!Q i8@(%N)#TS}ո.D/ȦnIsIuif2l&ʠ4MC6hEHK*(16`A"6ɰ$KXlM'r%M 7}Xۛ9%j'Mp e?;1_IR6ZZI&eLϽswFjLo!↛=MS>*mD%xg!1eDd=Sܢhl,/>e |EoR@ʲܻՁ?LͥI5Oe9Y4xjk kܧ`jc,rc)% /-R&!ı ;t6(Їb GO[i+ۻ5 f7gL[ 0A3,9}.ɚ`u[|J$Oro^*{rȒ}׈ . 90˜olC%ȣokǞ%q7Z4FM$\oC\Jֳ:0vf:4VuʃRJC7R㑯]R'.gۋ~gLq K~v5 }(b Nzo{? a0۰{*z}.^nu #C$: 0su,)mz0:feL B2P+0c­A,T9] +̓ l}^$m*qr+l]?O% %F2RiyF)rUEJFEwJ/4xrL D3~>{f9ƥKxQnfw}3pLkgC:F85+HRk)"xߑ:PL Bzi漍*fΟաIYM}T鸥#fSۻ}B5qafG-_!Cθ\Jrɕ*L<ܨ7̀$DQYMQS&vcPB3V*/ɀDQsw*m+hC u;G39wX& LNBIk7t⡥I4Ys?0XTV (z -#wqERH62eh+LCU Ւא6},A8TIHsC I,GCQ,Bq$[@Y!T ܢ^+ӈQZQYldcA83{ܑ3-Ǭ]뛄ģs*,vgse-(d1@&e1ֆi#WhDhB8F)is^b^Zb^b5"ËS{JfN ]`WͶ`J"M`$A9ck2AaGL\@,$(#x! KAZ#IBHIS]7SJ" ~A-tCfpu<[zxfKYp xy@,{b)c!$c0kϸV3* |L8x5Q` h@$C;Ymtm|u%ō`9#M68Ie ^Ln$'7WZĘжʔ:dԶK.Z<&hX(aLJ-4J!Ӑ( mpb>)Bl@Т'l8S ]^'pe@V@:M0Rj\i)l#88͘jKm?»^ݸ{Iz!c2IV†̂.ˌ#0?tRG~=C{мQw`8@SoN\~\Y3}l9w;9~Lo~w~ {/w~_˫W>MWhw}> ӱ2ο?{b~7m7\n#?Y\ wjG/>oSq%J"I]_70O‹/5Cb3.&٘7SL?M#Ep<]P:, Z:h0ՠ;39~aF Eߌ.G綸k̼+E~x~EjyީprovUKPFe}r||st}q)ӡg1Sۿ Ǔ;R 룂bŠ7nOG)a]̷A5'Wg 8??!wt^ ha5 ~)}2,? Bco|S?mm_^'{1OAOX߲NC柺/!|4t&b#Ɖ^]rǃy ~j-wnɻcio78;`D^), 44*;ӟLoGr#W 5qMX7<=boٝGٙv/?>XB/}. 
Zw״33VԸMϦeʡqJ&|0:u!h\doPWgHoU{]Ies;{p Klu@nX@u (R᜺c@ɽk;38xp*zop)}u7[͙ۋ*z qzNޟ_ G˔JeqүS(&B71ifY3 Ĭ4?[լPxk1To/bUF ϲ4q!3Yn5q$Y(s,"f k'fZU`P4 :׈#28=9 qm|uHǓeͭ!X$s8GgQY:TFP)@A+Ip" "ܲH .QY:6TK:6za'KsPV%y  En/^1tEi"Eٻ8n%W }.dX,ȇ \f$d iF Iq,oq$9hIYQifzêsجR(êsbm(bC N\o7,Zw}C+;㇯ւ?ǟ^7b6R-i{lW>MZ..?B*s(/dau.zAbF%+aӷ߅~YRPRPSPpm2VߝCt=T{2; I;|TO,I#'­hhhh65Ǔ[L~ki#jbl>>N"X^1݀G?U{PL#Ei]{L'!1h6PAX>"1D+4UC*{Kq5ZGx5.ƥָp8Pn=]8CWﺕmC??IpgW51lLP_d7iP}=GW¥7⼬~7L_,zLs~;[mY[ۭߜovzHn" >WbGWDRcEdj>훛4~srz^.#Xg ssvϷo~sW=t??Mɒڽ}l{=h\0[oިҽ't7[]OmVEwtǡ-o&G'k>b uo][g?tDv]$G5q> { #3щ3cƚzXSk6uă}Y?E2^~cw,"sVPz _5\6.̈ڕBIi5. r&F v g8dYz y*cT>fQ>cxzi `V9=[w<<}""+'xcҨ›H8Dݢ\dn\wuk守狷*~JD\%޻fzzZ$#PB0MSjRB\x>Y4ㆣݽ[82}LwO΅_ޛ>)FFǻ;^7:w`>ۿsr<}}>:=Lz( ^@E魁 ݻh{| >)OO)| Tb[#=>/4}:ۥ 8ՠKydІMU~v dkmk ^ʿϪ gǬ,(oiy)xh^N|Z =hazNߕlWddkV-5yyb( ?ĕRʾM te߁cW58 +s4_;}UK*42}x/8%Z]XRO vOpWʲV= ˓gu(JM5QH/'=~\C*?*pe!3^|ܟeLM2)9PO1r N&C!­рŊ*5\W&w=Yv,v/I^]j]"mi{&;ȮIN]k(3D'LCٶqeT"= ] V/p;t=}zw\ p"jۿ@YX%ۣN+<+G%"sdnNgO[QN bU6Yf i{S[B+꼰s.CB:u\H4{‡ٗ6o"lw-E,Z򎑝p'Q95#lAW1=rLt2wm}ޔ5i}J!+'fAPOSe^k]'|m M̌ r=9=aj\Zɠ(h5UpbEѴ:o1{]LhNr2q~Č1AsN̘6 JJ-XS1y^bZ8䇼l\hRp cb]dS)E ;ܱn6ƀRА7JF?7-K̏SB/놨qMԾV((ڨĩymd)D\@h)hKӬ\[O\L4=VcʱR.QT.z'+)A Q7^(W-[8q,t!'2%+B/'2p"pJ'A7w&1?(YgQmxxm&Es>jhЁt6Fqhe!^qd&TJ$[[LJȢk-Ii6GN\Pf[Z!c,Zzˆ9rae$^=Yۜ]vvR#,-8.1bn@[ !f i\ *  Z;D>V~|ҡ84&M :wj"%nóM4(2&60Ǒ0U(Vi".69:1b`Cn~Č3a9kfQc:mW>FCKhKA#RBKj/2) =pl[VGT)]9v{'Pv@mC#X"dpn@!h-ߊ4Jh0,=rXĨUm6ERAe"{`F+1kOe@MnNߞ=`nīqd[Jf\"7AhZAe nJ`ɷLYV [Wtf(U%)m@\74 7 o~܌q@o:LwuFZ$|:}M0jl R7\$P8MI[Z / &JA}3Y/enDߌQyuC3yKn>O7~uw2+{r5 ~k%Xϯ~ pPZIWk|+P_|7힃%W'-w}FɺZf;8cɫpuӫw]+_Y6~0ɫisVXJ6~+VN.Vsxj {_μ%<T A?ZqYJs/?&Er" S=LSQ7Ω[Kbִ$Ͷ /%c>9+4-'z(/D?Og 1ʅ=a\pOSx؁% %IbۧnD\[waa_+Fp5#l[W4LdT˹n˦BףHLe9Z[l'Fp3F̒qNp^!&7 UN47XhD |P՞(a,=܋/d伈NǨR1A2z؞>hڎV6g8d%рXLɃ*;u *)q.liBqXF7VSBg'2( 7 b7]X ЗpZ2NŕZE6C72컸qJuQ4Zф d# E.LJaEyQA*cspFwaRԡERHtLwSpO-,tiMQp?t˰vQ\Ϡ1A$`*K+^˅%um [p 劽;ጁЪUG!K1:($YLRݰwCUڲzIeOnS v񧉘*~Tbx*ڎ$I1xF1TuY~21A)QX3WP TSTB3 ̴q۟륬J N5~Hٞ` {KL`_!`DD1J 4jY %zJ:ܸI!*SݰnػT)Ŗ6=^5jH'V?4mJyܺB8i`_5UI74yP&<(PC*zuHPM)`H!42 {_k 
ETSi+?T2q, 2OiWz>-esoհ]dg[T$JaTaXlToW_=|4fU,U_%Ĉ(h--4EAm6crfϱ *dyX  +M Η㧂^!UEٛt1wxUm'ǐT].iB +vuWVlU)"a^lS*dfp"?0ˢbN`ےG$NO4\Ձqͅr˱aw->-VԤIҷzZ,ͲDgLʾ^&5i֋%n?;+;$Ų~Jѕnqя!|cU兪<+˞e[X4daMzj:ԭj35y+CHI0%z)C"[QN>܁S.meVqNI2ptZULXJղ%.1,_O iS}o 2ϻs Rێ+?}_ċكg(2w?v D~tJ58[t ߸)1\cvnTB'fV$;I : />4BI*CeD+~@Cc0^VdY%0 SwRC5 JrNNj+B YUAƺ#a/H:+6PIȒQB/$tl'oZ[f^dI:Ia'N2Q}g''L 饂ANWDӋ%jҋTTYU5ww= Jd z+wexFNkwTx%SIJQ (yj\Uz^2h*-j.60d"#R:JȮa,.Vƿ>uÉd % UTl=J-`(Q{1Hh9c &" E;m>+c/W"OKSGƱnM0[ p f8*,æ.ټa:K?anz9-`')-S.. I$,AtJ p-86\ 6A93Y}8Ih'=eilU\]!4 TUI'iNrONEuI3K`9^c;Ԩ%4 }(sawq:T/Ha$ aj~|qF |+};-SJZ({ $SRػLt.SJJGlz8n1`bPi OE@3@UDĕ/]20*UM篑%Y0AU$MDT9ik2`H|"DT,BJNJ_/BS4KHSdPGz*JzE0ޟ.{ Y=Vt`WS+ Lg}RqA=ԊWxFWθF;+dt V8L}8iΥ_0Y9]_z}F)ƽkfU g3x]O$ c%+NJL _ځc'?d[O~M;K[7l✫>Yjuy Ө}Q+(.ؤ6Uq)3Ωe]TV7Z '1DǙqpg.2) xi dO8OdRG`y3* FtV砨sWUpΧj.Uu#hoLb2\ lx#&2^%}>F_@Z~T?^y&< ,|~Ox{t는~y:^tƸ3VF] `RՋ(p,;l)ľ=We굮&~UTYA-ך ccWU_wQꦢθ1 p24N{8!ԏgUZ7jk ~y_#~ Gw-mxsX*Q녕1h7]a^(k`OQkU5ZkN&FGB\#k* d?Arlۢ/;K7Q1ef>u,%м{0hM> G6[J4ҞImx 7@[вn ^/7V]KF/ ߗsІ0jA]n.vsSz)];^ Z7noF^׷ gZnpάx,mÜKX[lWϬ demʳ%M+1)*P]W4qv~Pe 'h+ZNnЀ躉v>s0[lŸIwRQvSXl|&؉㦨% ୕\þ)s.r7 f WCq1 oit%y+L;[jRׯ/~aINS}\|Yeǩ=E"& VNC[M% y y Ίx9`vu!SSNMƣvg(Z`F˸skwKKTA\;LJU H;'&IG)|_J#ۧ&-"XJ, ;+@(lsjUxh7~{)j~p+`i4K͊sWlz};5'N[D3K|hr]h;[Ƌꭵ+v jI!1RL*aQ]nC"w*X.%u} A#jM@@pCq3VB^LZ k 04 %Р [CPcp-ĵ~4elBK)E{tw1eb{@Ïwf74R` D]ld\Z<'F@`s>ogs εx}lWEw>e jyZ؀w Ŭ6YYkww]{?ks4h7\3UgMrRITogϷ *-=Cry T ?QM"{}!BW5ͩ0"r#JW[be)l`F!&Yo_K;7I\pp:.([f~LuE ؿ7{+jDžhxW)uq[ћhdҙ[Is6'1ֻ1?Uշr_rhԿOٗo~LH&L]R% v~ƋpKrܢ7GG-^ u;5m0rn\9& >] xHat.p ]F|sJ9h \F`i=}L aʷ$l{kHu|;Δ'[w\[roߪ6 ~8z놠zl.k 0O-wӖdv`7G4))eЈ?˒$i|Lv`)W O'0Ĥ|c: XȌkݟQ!ye.LN̓~ۈd oDov>\fCO8h4}+t֍!T$H`8S>;=~ė'QFr(\=a+*~c2"u !Y4Cԫ%kOv}Gx.Z".Nd.!eW^YKER 5/G/{k& pp(9}n;m'#mD MV?Y"ByX<&s V*(`7 9 Z)mG$ .._wyޚ.^~Hey-,}[ 9x,I=Zcdǹ%[;x2ϾN>eք[Y>+JKOr cKM)DnT\G~X^GF[߼}cE%f~.NΊq4_#.axDZ߯4I:b$Y|ToWJ`-ݦ[Gp**Z>u Bd_*0Gv:yw?Vn[``l1͋jUTSZu7޷K,?/Ǭxu{S83oݕi$qHǒE 214:1?s[;|r:=O.Y]H-I{fPXY `H",:MDecd)>c_䢱#Rf|W_^Ĝ4<)//^)8OQG(BMƳ=́$ xAp?ݺXƕ -%o~VwS/j<~oǦnb*)ν''%RP[0ID&~ mt\mb 
}q{`R("~M(ɖ,TN?mOn3*obzi#86~9S" (4]STfq`P"6_YgF[Ol2Y6u'z7љ[=Zi4N aU9YHړ$(:a7p i dZi?E暧VPYn㜛pXhTba\.~h%: z:`<3 < 8%_L!ڤL2iC-!Ѿ]@FIs+u)zߟìqN,JJ"N&ng<GoQ,[S]Eml=)&@ CY_dPwf.ξ M;sE?MK׾޴TZWCR pL !׸!Ʌ*RN\s# J<~udT6]kYJYl~A)UZzzLE $LQ0H3yS<{cUx>c_xͭJ5+^Udɋs(*|UEY?d0+ Jր]2 Fϊ9ݑa!•ȠJ\~dP+]#d&>:PALeAsM_j &A4r(p y5A T_d_NkJw`D[a*TKvjwTm@` "&H8z+zyA7+t[+6K`N9[b׽iVKRFصY!zQeO=UȎ{Dez__0'%Ȯ'1vfH$υ?MD$K8=vaA [l^a[~ Olop>v0OYfZ\|ZZzG/`])*~dt92jd ρ_8FokU=:x`\Rޣvz!2ih J )BR2 0 `-ɘJO`,a\|>sMN!yMm߹2yW߾<><_ L%Me&>jNhs_\+d`DթG#y!?^PUvQPFZ*ti)acMɎbS >+?JQu~['e֧y"4%!X2M!IDA 4II*!ML!~Ji@~^xl>I8+.Ey@XSrez3$dItIAϮxNMٌ4OE-+{bא!2U kYϥ~VWCJܳٔLϦ,cIU& hȇՕD%\ JӝDɰ$nnvo}^p&(r~ZUr%RD$G[+(X 2iP2j}?o6uZBXyqG m[K[=G$ȋϗ0TדH2Oj'Dz^Yը9RVZyKO\6իB߼6з~PA8StK?xr@a= .o5mθ]jRy_507ҠT@|j`M9Тs)-V;%R0`.˅ҭFp%BcgU_$= Ȏ3 DՙܦAY; <](00+_7p;.;5T.$ޑr]kfcEJW;?Jo:>ɞ@ kH1ͻFN+kۍ$P)5\6F"ߗ:zq-(PsXH\C`nDHun$܊|$J }&!3`Q":t1}X-)uE{L癉8JƘe5DZjԳ^z9T:%Åh(.*בB IЄk}piT"0?0A,::CX%9!/ 3L))/RoCbz0 Nzy%߶8>GHa-ī2r>ӋñŃmwLO\pBEs%v=[G&!Fe*8`q,)v,OGLԭYȂEɏehaxP w-%ʎqx0]?ZA:;V \O$^Y8d4Kԍ͗3x+X==FgEi㑪FxhZ:TnƏ?; 6 |j8y8fb*Tec8"o?."kjsf5^7PྸǢw0)z&u*f9NˏV/,;.G9R"?HOH#a̢'(\%yL0nH2B$BfhQ L'D,>υ~x7XՑOf:_W /Vv9IV̖?9'YiүS+NM_#ԜZ!S`VD=Bɮ &x ֻgH$]/!~[5l @bm=k-@V7^bu}%y98XJ$L$8 )j/ydo "SJJ? jԄyge5vݥEUM5WY -l~TL %%<خJ[o#٠FR%-ۚ,qzJ x.ɳ,H .Sۙ]Xj}f6eXN%q]IiYXFQb29SP ៙)2%zAKzPikoD{s ޤym׵ll/nY*y|6ɉ9:O>h)ڗxrfyꥦ'+-yhzQ-D΀1G]uOJ?T6΄D iJ;# Qx}M3^m1tIdqSDz,xQ۷u; kbv1}hejޭ>3bτ\l׳YݵX:gɳ9{! 
.4L)w)Cox-Tn``Uǀ/})Gl<s AHתq;jYoxA: Ֆ#r1_V3|uAƁFaYcΚ~A=t:ׂnVU)7gF7jbZ3\df4Mk5E%C'+֑2V Jy`KOI "ޱ߮2k!ty$l TNGe~Ȅ1]ٽ6[j|ϴJ&j W*zi{k?8F4ʥ#'4SŌRaz%3Qzhm>U 0Mh`oDjǔYX"klk@WMR@&1_/2h$OF|,Hg0ɲPUvD=Qs\jY;M3&T-p]-]MbWc d AZ,8d=|_" Pi`g#dq x2 =>~:$ \ϩ97t \# ,!Q'*X)|ȩy[b$kOJzjxw!Eu>AgcqzmE؊,Jgn|M{1jI-P_ a]#MUU:8.Y='^_9#'_?,Z!t$ /#n'hև^D NS+G8⌇*g5'WpiN+kG6\]ӣ !O`?@\:G<`h__)CxzK"[ [ϰ C u2 z%"\~ yyQ2 xSFXXˣt^0ׅb"̚o^MOMHKF{{H lzMmGFpmA$])b4YIjzY%UpR[,-T`3i*- z+PK!M^d-lH r)B:Ltl(qz&w[vf:sFU2;RmZL)lVZ)Hr%R(e!enQ '+9@cZKr%l^*t_kmEg;@oۙ[4hgXLE~rf=Dv#qRh%9$CCz˭JB$68ƳXUC@V J u:%\-9ܐǃFjcjm|R gT]ox0,,Gkx82rs2._WܘCFmZnǚ܎9l3KtcljSkVeuO|z׶9jZ`w&U*?F{4 ض[=r3|gi.^^;rXC\ wYWkǢ-iY^I^|{k_;:*H"۱q2A _p˭}[`C5tS~cS3UwQ+-[)nHwT4wd~*613uqጀuOZ c9!DX(uZR#>I.N5KVsPKgۏx6nPc}Z QetݜC(N,Q#6p`Ͻ{:N,5c\懍{@r代t5H<IM:Wi68v:N L'X#fBTn3'olKc\#кDZ+Uq:뗿КWgG2gbW>|ՋTqP8(v\UИ +iya4Usњ)Bh K1W,ؙ~w/.zf'7m/SXFyu݈^ߧ_#![a 7A~2 z9h#H")7ٚ/Ø0{HIcUJ(ey\8p\'u{A&\`YgZBs+!ZS%b͜Q?Uw4n'?ZbXʧs;%}e?GTK$7`~nE[P-ou>=+z- X  MRc/WLo[#*teGK$Xa7+y$ݒH)|312u0L1_cmc7ߴu S(o.jq4.YqQZ=:v_@K8  to4Cr+EOo]pO]nghGS}?tb Ϡ3 5?0;2I{Zd4&p !}Yϵ.D2~DŽ6-}'`dl\~r{^4܍OJ|ЃRNqoxF]?<37 +9<(WS3:f F)-J_}|KVy(:76:[~0-E6j ; οU[WK[$hK~:e{||n'(_gݟFB{uzYH'$FJ ѿ0yd4yUeŰTNϫE[1%7~M7wE#3w'Iq{r4bmAw_~:hyv.IfZ"{KѮhZu]?kb6-v8k);]w0-#^Y34A]JS./:\eKIW'Vpd+$8w7 eSwz8*i@`\GnfZo(]@(U6.w5]8n&ɺ8NIYaԺ׳ׯSo.-x:|y3 ˺1BHpʫ oqB&S¥_B~xdznw"9R%?WdX$m!}Oj`ʈ2굯 vo1N'G0\8Q5r+*8Qr&rh)C@OU5ɳo+*21Je`h{)7H!nTL(kkX̌q)?֊VAA?d%!n@Å, n+]kA) Acee~ַpf??f!MBW|g8\/B'g! 皯 WDYS^Iëj«޿yx/q8GC, >( /Zp@Rm\- zaB%ta0,yx i};зC};з! 
p EqP(KAM3j4rV;4V9&d<rVv EH)pJ j|9td,ElL) ryՏpd6?=tc{8Iu WGw< pT(N]rr P2H0b@ћWG?N^|4 A<5I.nӐ$N4;ӉeE`x0W3)og{ÓY{ՑyGWp]tII,J;Bꎀ3.-_A U ޚʎG%CQSsK.]-apODK[[O;WF$?;zg 8Ž(ʋ rºxY(EhZ\unZ4S=l62Ǘx?_}$ߙ;Fyޖ׋rvhu`zY'm9XHYbowmhǔq"9sF1 4VRo0!g/ElRxVe 8o好XG U6ّ0؎LB2-0`+ @.S `Vt|pA0+#3 &BwRU>-َLCd: [N8_#|ߑ_92 )Z㏟NsQ!+ĩu1Q&q-C*'ȇA9\?z?/K_G Wxܑ (ߑ$?(&w ^?K }I´K`B x<$ \&U}81:1_9ӗ*MEZ(+D_I/ 6FTa$cx+D=Bї2$/y~A6@WIDσB؎x~|(f_@σL:X~pk cy} N"LCZQ~`,3V p O;c[?n>w, N;B$B0e|:3|@8*bX{/ {'àJoigN_CA2, 鹞-mz?_* bߢODމb H.g-p?~2^nU>mv9]mR6)i3!u[IM0U+hEV$HJ9Rxu W$1KI$ %Xa~˫7uo?*5K h29oƥ4~x i^j-%2R[($)XV1l(So]_$zʪ%ߜDNhkKLFy}16/Q,Fr&:|,ԍ GA?1Ȩݩt<*7xа7ݵ#qcp>VxXrdbckI^;.ե f"x_ Vϳ~w\mp #+/9Stwe_તtug8X"z)rZ"|^31&)s>ExaJ %NDIWT:]0c'RlL_s2+J 84$ǰ6aIfp+YQ{,Bzj(mߟ+^S=#ZvyI1&Q3VtH#+C{M H1[҄@wo&טuyfέ}8$i3,̔v60r睁N d1FjrFBai.c`#7l17bM7@#,hw]3\b4Й˘I'(6PdEbU%X n Rؒ]+A^Y~e+̞b{O`QjkWnGhFG5%ȫc@DTooY~:.-/iͣ3M6ooԱ,0rF j*{?@zZQa z>"HKu=gjP#N(uhR?k7nݵ~R@ug:0=vj.;eq nTa%""p`#T/՜\)9 %HdZ@slͽ ΃lX=tA#ih`m&oT=fm)hdNs @}IoOCvx v IlpP{9z.B׀DE>c ӹTGV 3ƨvVҸ@-a9Va܄ &OI".DŽzUDk B[eaii0 nPaZ$عQH8Bȁw;+`9 %Q`dGjVFf|Gd İ}n5%$ 0p`rngO1A{0'?TY/AChq#Sabu#YcŴU atxA9Lg%mY A$O7.R˃7 ds< Z1N _bx c`cH2JRE].ѭT|$.ŭ6,FQWZHA0XҦ6;J+ pFЈ( "%b$&1DiåRJ0 ĒV[VV HpH"N0Tʚ䀁RղFQ$I!!k!4gH-^A'Yb ]%0udMJ̹@J,PS][VV He +uT6J[#Iqܡܞ.!od8+ (=$ DXiDRRLaDrZw)7)J-GLˆxApܪԈ,eUjEbb&J A*$2 、.Aޫ µ#UXD(7(ndݑQܓ9YSa|KUl@0øv?.9zV XΨ.HcE@BL (ϝ*@IJ4G[B)rr42[lXƘNC9Zĸ)c2'Z|j]!EϹ34 '8+>kc*xPVo (1XlN"M4;l,R "lc'x(S0@‡(ƒ ¶XR@d lc B*#9L,yBrP-5HJ+ nIS iXm_WK+JZÛSIV 4B:1,iXż*Hp*:Hԑ4I(A E9Uj [.M5 X[Bv $\,ۓJd:7 , Jy9,gG]O #[rŴKcM q`gQ[{gjAw蜬ƽ,91w 8HK,XVc@ j'}V±BxIV+3HHC\-v tALjZT`%z r-)c2AN&$7@D6Ǔ#Ql!joR*pSZjb6`[9\QJQ.Tt~6 :FҺ[-cu&}ø8h_7K]U]fJ Y[Kس)ktM !D#{US8K)x,O?sgkrw?2O`W ܛ=WP_ @ v*YA[|0~߆LPR7ebn`OWocl0a>d}lr١~CxC䗲Ù m; {5O;3"x(-(Ģ!LprrDwqor%?-pbəaEF>"FPD7tR{Sn㗹o:=fn l0l:/m)^^.rƷ~mfO;?3Y@K/ @l:އ#H(so'dǙBXc՝9LUt%imQ0dtmjc Tf c^e1EE`1gUV5?Iz?)T#kSHS:RRx$^l~(quOAL~_%X2{5,aȒǀOl~MsF[/hZݯurJl~sS=t.LXzXhvqkD

,M9ݥ~i=XFK.)ӛ|ֈ55Xx&GD)|ڶ# tw>ْihl7wƂ}VFǶ5pG\9gA3CqNyA^Ζ9wBs23 F;L_ve~i"{(B <<>l[jE~E9!Tٺ?'|1)6>gwB1n;V&.TT¶p?;pgKlsY1}I?D9BJvP8g)y6Yh71׭"lM&R&P7/Eq"k=ߕ?.˧޴#*Pnģ%E{ HQEYnD:, tFQO sQÂ" 6ɣuyVd;(UE1)C\ˠb5A'*|cğ!q/̿dpa2U 9ŧ (9NEN,1ӣ$G=~EϚfIu'2_=bq;ev{PST¶D R1~ǤB zp$V J]\[r倯֟jtfykk3̘yfPE*<[iiJЧ7^:8BVW4u9B`'F̎9N^h)_fضl6?uģԀ+Fy%Uf'%䘟@gV+^0Qm uUW/S8RLZDP:=" P̣}PF@u`~ .tSx*h ҽk p#60>tLRe_Ys?+zZ(Edh0WU]qI~$N8B2Lތwiҹ!:mXqeec.|lOy*N=$2Oi;LI`+Cwp%(&s'#f,T ulw2LKjPFP7?F^5s3b D$D!\Ê+D8ɛ?Ֆ00ń'>_ds&aB4oFǛ?;|}t3^rǃ6X> DcȘ\Z};Vtͧ6NU_;xQ%s:id"=U@EUgnx0wDh7`߆_7uȺ#]C'5FALTd1 /hc2FL|߬9 DǍOyˍ?FX9G'!Tpv@-16JaKبȘT36MmFu$bígrlwVVlQB4!ТmyPENb-s@?ɫ;1~8 ;lVɤ:t3dNCk;36|a0lÀXq 0z?|񟟾L]eĎf ?f>Ҙ$WeW*_Ųrr|zsk+| 05Bќ˜axQ ޺n%#\75> oAXs<bЀd |1[ns׾47o\[8䅺cG$ƫJXiq+-bUbM1e"8)i/in&Tb(fbfKOqÉqcI";ĘӵM|{7KX:,I57ÿ oW}\$1wfn =J5{XG/ zNŞLQʕ3w Hhs5y]YoI+v0K:=`g`=2 "O7Ѣ&)w{7HJKbVh0l*fF}GXzFSf c #vG@(A%gaBQi(DȮZE/+޻Q:+H ;KbXQM[JH%q+5X'ހNSo,aI)bTHcFX:pLH"tw \뎠[w*ViXfmo F2O)c`""@ :E1Fc$8=;fւ)?V۷e즭 YAoPhM#2z_>l҅%V<=}{u8_y0DvHDY+/_F)-1Tt ZQ#W4U˹a\`I@^d}tibxA2:p'=p^zd HGG>ܢw/q/DzݖHOk*=%q8(UKvp{D|pTHl 51X-kؙ%:2mn 7Qz!'PDhY Λ# \#낷V/P!"z΍F^:Ra]2Sn@⊢>4;Ԅq)<2 K) !E<.6P ǓCA}ie]G/z|ۡ!HL "Cmn Ay ).v(n bjc<2+J%>lL ÂJfAJbQuPO@Ȱֵ6gfU QXJBGnQjMNq5V[ƃmhLx#`& 6 qc[»mPS@Z ݦ=]Lrw  ;ݽlpjDe˜!(i\jD( U'1aI$mcr0&r`eE0@;Nǚz8\< l+ыxn_b0KKߥUO>E#:Rhݐ^'_#[%z%/zr;6_;Q8a$O%G+ɿ<4 Zԉ># դ8&K9wJI@>it[˻uGy7 6N:9]Dܡ}-޾xQu8K;{Z "0?,yI>`,9'ID*xgum)$$$ú4oq>p1X#"(hJQDPTg*/qNTttEYaX形M9RփSe"R#h%b2hsۨ و(bymˤ3 z`V w؏Ce*ܮ3ϔ)>ߌ  h ;Gqۘ n LL9Zo|zK:1r;p ]WId jYaO:|YG/b}kC4IN1Tq]#b7ӭ=l5Sh#t~ow׸$QSWpa-ifUוI(U+q)јTQK8Q!{#%Bʾ SqW[{ $9~]Y1ǥSv/Ej_,09Cˣ9'{ߤEH=QŀNJF<8T.NDi/?(Ƶ2ؿ}kCN o}#4Jb̾YTΜ{8աՊ0|:DLdgoh&RthlN[(bӔ*>J}}8&?5}5$DrÒ[0ŘY%ŋ/.xq#8"wpWӔS.X );FF"lg w5L^Ô5L^zWu$v\2, ]!a)DA\"zJ,ʾG]:Q=Cd {ǽ4!2eh;L/)q`)q`XOXin]pԂ@FM*S*|¥Fwr р]Mep2.`T6 0B̝^7t|3L7t|3e8PBL ˃vjIXHԓ`HDKp0_ $˖҉dte8nPe6?P/arɑ&GjXwBmv6k("T"W(r$Y )X9NI-m~҉tÅqG]֋|T@&]/423hBQUbg<%׃iF!,5Mm5dݛ7-z[ . 
7Єp%QCEԢK^݂no>}"OwۛD燧|G7s=4H)A}pSf4 5%y[CuGvKGNsnF|7o+~qA-3LKY'`?<}Qo[z;~tEk?b̎9.MGSeE&my~P̐堗NB\!B$zoiC<;++đ2x.OR]<]\.h٭udW»;l\=})q;Ѧ]ժy }NR^,^(. 8Ws!jf Rb^ (H!VR! 2ظ;͞Dz4_V*$;tZ0GTq V%g"!OfF(iA8#LF0,Uio_Jw1EX75\ab k+2ڪ Y)JELRj!sۨM T"+pK@W' }2R`\s9%I$kVEɚkSt6@ [ _gZ,RK1jEzr.ѯI`ehPlYOqRk&u%'vaܯN^D2ёۆ.͏/.-wTqxI)Ƃ1m`v[AhD;hBq $O\c)0_oޔR%iBPp\0$3EI`cSSZ5qA9Ssʫx~ ^ zkqoם׃y5EA~ng47l_A!0RVpoۃ|x|i 3+NQ{O_IB~0{o>_F ,`!Y؆OxϨQ$GtJ·ym,{p6S|Dޅ/#>? lA%(VV{v/9F; iFIFFΗj[{%[;8+ Սs+^bԗs2u4M'k1C3#1ی?0#"6 32\3rNH9Sx]ms' >:{ZLόIn{3mmS0Oqrn+laBGgrIpu 2D8JaʯaʯkuC EGElsD5{DiڢUK(fU*RKDRRo8OAEQ}Nŵ)iX]SS?| ]kjh>YjahvS-ŵJ?)`0~). R߇YL)z\7'zO !dEwٴe̩u>9gBՅWy|src/ 8uSB伸iݤ>SK 䐆o+qwO9`X?!8-#.^.)}ƿtw&:RL}/%s}J;^*ilr!Nӕ%6X#oDP~Hpgpac)xߜѷmlO 7kYPR`x=8s$<6*_f/hqޛ}Ə}?|K[QDh/Ƽ1!ǬP8 ~zKb?o / ,i)i18t~Y8;nLgpHꪗ7mzTvs3[)XgUFve%blo]jNTzp@{-C-1Ɯvg}#Gr7P%R-EBSXӝO9ì'뤼ɾ=ydvf?-r'Ѥ{)z3W.]2y <3wB|ת%ߧ0ԤDgV1tQ%hG5\*zn"l2J0f y_  !WɀJ)9\mYL?x|q? i0IZ|~ѵuޒR]&{e^gh0FbV~+fU⯘cT*I8?Ox3?o:$KmN:q[VTqޫYo3z4KBr6ÂڤʞD#Bi28l]T4)F !0j#O%Ak8S(̳sۗlvP-wmm~Yy+^)988'$Le$3dK[7n&ĶZb}Ū"YtIR1^kS/pކcNJ ۰!rq ԤPuL[e_؈Tq3f}&qКYS8U/7m%y{4I`֡;H$eZ@AtIt@Kß}CDz6sV P2b$fe Z@,ed*Y2-fZ{= QkscZ;OC7%ivlX_(HgdR;;ԆHbj1d<,؂J#ڀ QbEKhZQ=2l\!J=t-뿾{dD2%!A fz Xݩ}%6sLzmϞv F& W 1{1f!GM2.ⱬV5;3[FӪ}kUa.֭ JyL)y+W-5RT WcSиBqZ'%I^JZU{s7]KGD>{vRoSXv' t^@܌ҕTYA0 S6rև˽ Dؠ))1(K%ҷ{ٵ劬J0??LcQmcECu~Qo_>=Yf5sLó_W%Q_V_|>xE/|򷛰wᑭ3Xv17?ۛهS\T&տsO/JRsdOg|$pm"RoBobі}+.IJ()8*QpT 5M4íٺ" lĀHZzƤ"4h{ =c:m8]•vgv1h[N/ (s_i*; bBv>D h"K㙕Pjd\6{-ehO4cX4X |x0Gt(! nG/C_>e8\;5 {CJ4#Tq͍Z^1 ךu|E.*e( aLf8ZLf1}%ZݧK$nyxYNQ0 x|0Ƌ_49t®`LeA?W0AYFҾ+Eo&?W3& <}~\Ts^~ykV1i%lhN3 -%bΆWKv1l9wJpJ^i%֌@w6uQo#BVH.r"ÌAjEYT0OY#HhӸɅҦQ!Y`.F"8 m7W!#ХASQ3Ke<%[H(b'Y;Nd8]=G|7~כ<R+xֆ$KT9ٴh% AS)vsk;+0㸣腍"noPH`=By,>b!U"|b-$wo;"z#vܞPDww XIJvoEyxŰBzT8`` !a?M;]U-H[PNj>0S ?RI@*(&Q)!qÁF5a\w .ǭ;"? 
]46X Z\80L\]9IcBI5 zHD8$(6âNAB3dzNJ@BO~D n>k\ǽ)78/@u"H(=s# :Gl^x1~n5ϪRoELЎ83$1+޵?/c7AN_?RPRRC 'S/  qQVJQQQu٫ghckYgaQ]<*B#0aBi29'EYY=/o/ U7)} 萤7Lm0C`3٤Ki^[% YN(ciobd%3Frrm^r9)?wU7K.7Q9,jc:&=>fර^YAޘc=S>|x)?MdHUtnki?]ߏIC5^9(pER."`u\DQTkxՂ:/rӌq.\Za0q)鼤L#'yR{l`6&'.!H踹zU>odX%ymE'f7ny.j4-;sBpbp5,ֶ>)h-j4SQ5/7֓`kz=ɘʵrUYJ=&T2G*WLܣas"3&}Tng7ɓ ɖW<]T0[c('MuXXEI dQǷW_퍹Z**ec,9V0dZzÃB' 8/0Drpڰ}^v2.A](E*$?t8fM?Y3K?MBe;NN,="'sBD-<e}-y>HQUk_ϳd3u;tf;õBGlV:lOp_Y pCZH$.:h/*sF s~"->M濱@+A%(v4@|H2|-F[>|mn0t 79HK$@ (xU"o4Pl'$c@(+_>W))i8Wo.%6g[an=x=~Y^!/X#nG+fߒx.}\* Ϋ!pnrwo/;n# i Ïooq !8j/)DVϟ_v|e< $lG/JaɅlƊ5^UbKLĈsz"]wנKENpn -`Iۋ<>OmY1 &c\N9r6_a7ݪtX6*u"%LPa6vFctK"7:n{S-Mk'zr١_%9GAHqa .)Vߊ: X<speygK(KN""/3E@A:’~֜&A_ex p"2/񱘛$)6_|vvfL7&/1sWZd05.1WG3[b(Y a6je_^+iV~#qA wzvy p?a+G ʠdcnFjOj-ળ4 noqJ}gPuw~ޥ UM*kpZlyޖ7.Iﭪjj5KA7;2xϪܳEmp^| bx|/LˏGic$UYkzP{1}L[K(5aVk|dO^Qa3A18h{\t6u7]MGTR"0>^Z,E5(-50_hG˂q i0+0GJi-CXCMyΐO`Rk_3 EO&CA]̈(a#$ :i 0uvXk~RZLDE^HmVaqc[%o{0X9 NY&c vVSP_0%%rDamC4F0bt2g*, UsĤ@ouFqmd`N.&כwھE9,uyeyb2):T6dp V0Gp^ WE ;5#Kפ'e'Y#(9]_bZ+ TfI"u)ӊ j51K -UpEp&]m Ԗ܆.s(ɻn}{b:1EbDpr3@V8+ƌX%G.,3%G geP-E4!?c4 3XT Ǻ𮷨5GC%^}Y8->&ﺱn|5^<>EftHJ7y;𻍹3xk?-VW 7qͫ?wg4e@b\EdKA9{e`IَBzs.ÒEBVGhLQ6z8 lE*ei#)W琗I~"I׏?q& e{oU(yö Py5mZr&Z"X8+RMQD Qb)ʺ$NЬۦUNSz]GcoC鄠d.NgBK` 'Y'k(q\H&ܻnzboւ!2ߦBufݼ-2DmƖUC$cQC?抂@$yRs͉\s`L1Na]7pLr^CmlJ9ڼ)(IQ֗`#_8ķhI2FGCTnWtBw|"pd@ 2Gsܘ)BrɃ_kn&*,X=z zVE!PnP]bp$a!xK7 K}|7%5zσu 8Gw[lzuٓya'0!6l mv޺IXx׶{V_N aة H}AsF* %Cs{{-$\Ӽ;{u%sOݧi|tvO <Hၡaf4Gn觽~!ppcO~t6;{܊*rqF$?;gzT4?PCopbv+vwaܡqRD~V{l'O]} o5j…Urq>)Ӡ1tԢXL#H&ȁsi|fn:jۥ+4~g7&o4z̴Ph$O⛽]}_.g9\bI3LD33YQ;P5{'?u't ϵ+뀃RG 2^jƲ8oïUY2!f  CwEV#j }wh q{⠔CmZ.Wqq;2H:t'x~$gZ\Ņ6rqrxwr|\{Zͫ9bNE څ.(RɋĈѬoFlY_9)r3ш&r\~ålyw/^}e6/7Fg8Sg78e䤲'?[y78O7wql|}~NE KW_IWg27bzX%1i澆24K=oA5d N3ǹ)Lx Mִ%kyGbܐԬq Psʠ/ftP%*̜H5oLX0B挘h9SAHwx#@;4gqfJ4 K8 ;odWA 6{E8BvPǹF׌8إp[Oe۰{x{^rg`'nCwdkfDѮ%ys nZ[1eț2}C-,,U'EpPŢd0X/ޕr9f6R{}/vNWNpġ㇡9tb}H33bOt .׮t ަKل7XTBv$83REP|襁uH<1P~[C=aM`DbǧGg} T0US'` F}-{/mGK~kMuk<۪I4#4=?1>Qϟ}>&̔x P̵ad @%3;kFHB15łEF=w4 tYVJ ՉyȺY9#"gd]43ubN,9Q")JHJcEEB 
@]d_-oLtB#I"˶]dٶ,vєmk\a< KIvp)4rS4`4(%ܮ4r~|K#[eNH*+),4`S-gWˀT.P9K̐ m1s@*WTӉ#F JF"aNY`BBq-!s]&IHP Cmb’ 0 :#J^h0"-z2VU :'P L5sM#jCh`w&Crv H&R  :(CtP2_S=fJ rV[jZ4T1dMs&^ϊe{qAq=+WY F55^ϊQNJIQ\ Sa ׮tPPiCXo͙%`PUj5P.8lֳmŶ 0qs([5,/ԚWTP#!\ j1MȖ3B/n|T-T lV'[^ ~ VXO"ChʩJ z]g1+9:=tϲZh\pmZ-y #7.TA@Gr_I.1؛ Vy48?7ql[9|k< ,pQ1$sj"4!uA"f ^^3\ >S w2Eb^R\g' $ʓ;O;fn)j`j#bKѲ"gO/&si$f!=%yA R -#AZ`*LeJ lCuwz>XȚ4e*=U lLVIz3IX,;Hԍ@iѭ)gVe*m͏]ySxy*=SB9i0oNg2M>~ivw&v ]I#֐J9P5y@hù`h/@ 1M=V+ \$R#x'PƻDR|9loF/wZF/09P1$,Cn7W@ s ̘;?M ,4"+B-:%2Ӂi `6*3%TZRCv]Ι}B*!Z?~cڪڒ+b>;h[ql-=}e#=IxuEڬxu.CjP_o%-߿v6 /ekm:вu~Q1Ӂ/b|saIi@NeFcV['J e)9`E|%*HG#72Ưh;v CGo_>i-}cF=Kw_ϾK^Z/ t)[=!P$9U3fآBPۇPcz[2T#iW0;9ORV{0|2ef]|p^uSM`9zR.$=M߳])t(]Rx4}Ud]w0p &e]^~֣#4D}d܌qI8Z}ɣ :^ Eyos1 1<D9@)D1:QhØ::0\mX A),&t&B]MGКew#`F8BNY4xK@4 CC܊@S $Cem&腡LSLٶr.taW&ؚO^m#aeDєLpV5E7QhvrJ%SrDZ*/ZS+zx,y*iT+'>:_$p7# X)U1K:sc0aJPTJM !W?[\wbRv<9n\\v(X9loF_j# ?>1`B*vؙmzW7u7jPp /[oT .@9#V]/8?^k !FܹwwNӸ;Y~~q?%Rj/\I~~i>?>=e[aB///Y=m)@Kwgnq20<\]]|n/ZODq`-E)h`R3xwvee ' owu짇|WX2kHZOйi#){w7{\/U,W4lZR-nv =^C ] ](AۨeC0 ʤ]ڧ5p9~ljtVn=`GjtoΆ.~O٧TknСR%Z35!F{JXLQ8Icp& ,'I+ Ϟ^tVגwnflkB1bo\ϔ詡 tpjZ,Q~I0g1Y|iz8Vʗo1O 7?qvd0J}3j_VQ GBmwó$_O6r2o#/S%m| KxlHsAfq 8w[N@\i ~ZOko ?Y T2l`nD!]ŗ˛f9k&\],dzH_!-9ÀbƸgeB%JMRt7%f1D0-TEddDfoRjoĎTؒvVm--"8fn!cE{Oc[xƝ7g'< =c r0AmgAe]ϮҟҊQҟޗ璟`$d>⢦sٖOwd4VKĆ.-hAC׋XxXJT%IW tc߫ oytnCeN'aBP^1te3VId*sIE$j5bG Ĉ9xM$@F dr&LZ:eGYzJ{I'S`"+gJKzPJ\\iK<*K#* {m?lҨHάc/2 פ,K{L4 GtmIho~7T)&&9[Glf5"ZI.{Z6s'ecSB?&$)X ~f]m^k?$GȜtg:>ms6Epc Q33t"3%bꣷʗH-HC9jj.@$Ee5:%)AD(%E͐9" fh Aaz(D-Lz+PO ϐaZb?G?p* R/XGɆhBf%񗳛 O_6!lUJUj_MwrDk)v҂FaMAT<%Tbj45$~-q[tzP~-|j;Q0ʼnaZFhi|7_9_ʠ_g~bu%Z'"} )y)-I_. 
TQs!­q{ez)3H,Gj#\Cm ug2F]]~ź oVf3OrUnpéK2̯QGw>P_4cdzssMKM"/?ijϮ>)_:ţU )tϥո}0 S6ŴWec3mc>Gvu9y7yk.t,aP [ iвSr༫]8Z} ӍȮ8%eBW.&sR"56Z%R "q@#x,wped L4OGA іn7f7J@v(( įYivJmޥfan4J5\9 SO6ul$RլRdz 5&"=CHzC"'/)Wɓ֞^H}3Ѱx`?R M*a ȻRx=kqF+̆/Fw:f;w9!uhVVOYBnnEMGo ~{ " Dj*_w 0J~~O>bwܘǕ;w+0%;h+q؃~yt/ )ZBbS;QR- E#0VECbqdhjo/Vk !xC%xo W^(&˭?lRQ%ȣbj5ջ@9c\;Z?tS̚+ -'&`6X(-Un~-wb, +Bd`ȩ( rg5gXRl1e7r}i\fQ=XzzBۛII|31G#*nhfjǠ*1YϚ *pZr/dKiæ _JX}$ޗן< eEhN"avyg5W*Q @}f@C-U|ƗGqzw .Jo_Ui,xߢj9Mñ09p7PJ (+Mʀn4-`&ӹ5Sbw6QT(Y!TR*uŞ1ٷO-V2N~iE ٲt%-=cEZ+}wv{?|*yq̃Mf8E{ZDq'[ R h!A~k~ڧ$wqɎ W ¸#n]̕}[Vl_Y73DxW]R+^ϭ:֮C0Huuup"XAD: `U75UD`Ѳ/wrrZ>mS&yۧg;.oXЎ[c6;r$HP,yj}D ejo$ߕ.~*H,=^Y~Y4jN[qiȠS̠M*"ysHK hy0 Eԛ5VA&FGKk}8vU'R^shZ]̓O_H^L=f@Z5* D{j-{ET-ܖ0b{.QB{ ~j0iU$V6⤥prVjǓ g fMhv~ .&"hm=U~9 4m'fSj2D\B $nٚ92OElJ< v&4*> ) `?wY5_ 1zo\ki ۠B 8 eb!)>kOf`j .xw~l5ygY~G8L.i4r۞3A1՘J ?G?urlŲjG٢--~H݌]xa68~#*<2Gz(;`&0wH{@FE EiR2lDYQ9ҷ፾ R)PrŘ@1d)(M0V+1<Æ8B(O&9(븥0p7>jCnSQ!X=2Zo*R`.Pv^c"s,@\JGg5CI.2TJAP1IX1DNENe#b (Zd$ X?ckjF'^lw$8ӸU6D3*'b]l+ᚪqg#W!H O_**ˑcFFXL?6Hyِ I@ű.IIl@$zb0n9J=4"ZA5V1).Hez++O5(AAR3̕]-dF9ZC+ ))8j H-8DaaqDĆ֤f-FxQ l;e@iϔtg&Lnt8qd>@sL6R3dwxh ?=t9l `%i9d\~y4,-_ױsOf#Z@?!}H=]nU+d}X& eѝiyf~ )Kڶ"iBjOwRv^yBQS`d yDp%#JQ9 ۪N1"ȮZQ~/8]GW*?dݣfن=ɩɩK~Fsj+n]׭Qc6k:ͱ@u&ñ*ɚB1 (]QvA97Y4`9@ ҝ+C!B^=o:ئ6rYmm2u~@c.tfv}ZvF)R 9(Ssڕ(kYxz´` 6 %ݱ𨍣6 ӂQV,Lzx,T"vjփD!9mM{h=Htwcu/|c|l7MXᨔTtT9|sr7}pTW]ZV2GOwn&{2bdWJ03g)^qqZo4xC#6@Λ:Jdk{Si9"+E1l|l}Ib18CU3Dol<+ּAG88 =m o^^:oW^:4/wl%,h ^>93wуMŬ;o&stt~n2z(rOhZ-Srg.krTc?M鱬C5jlUg%"r&ehdB=+ժO;gj7(UR١p<|62&45^j؆Hd.grL%*z{ڱg N g^ }Ʃ{W4E#@@}cK~/k"8d=xazOE'V Ҽ5,kr5*`o÷d޵6cqVxFW`~T Vawz԰uM&L*jsQ6fiRQ[(4uC#Z0]Ekͩ\co' 1b`3uQEh$rs|k/j\̐OPdu¬3=tyfHlq#έ<]! 
6 5U1&ܷ$[ӝ͔EO%< cĞJHsm ֋-Ĺv[e5W_+4dC.l0Oz;g~m}[wPpVQ QOI8hDp-n)Pj:A~JL -K1JmjH(Ʒ5z[L)#* yBb1 D$J(A$JcDGj 赩FrE˜|ӧJ0eQ$3DN¹2tÔyL_6FAM?vYu|`65ʹ>9c̶b3o~/a>ZW9Hq@QEG=nMP@ U.Id>#ɚD4!w߮itJK壥W)(w-n*yv%M\V_ !{ 5i\E I)F2H*F WT^&Q-JEDȮzs+*i:Z}ShwIdstu/T@$s`Q Ε8_ FFy7 DwVުc[lu#֬ߪ'AV!WHvn-{$DCz1, ˼{_?T,zJyޚ%ID䠦;1Y1_t]iecv G7EzϵO *KjO@H5GEզ0VLw+9;-7l@cKb.F շ:"%8ȍ=^*w :hyN<̈́^kAϦv2 %qVv.B3;woGݝ5dI2֒MԺ8m^iWvuy0 DXZiϱ"#ޖf[Ve,#P0 $x*C  0^ebBBNcwX\;{=P;z2pR{+Ê1JG B_{6-' O7{(]yӞ*u/VGap NXL8o&Թu2"(k`Yuqm*ѻաT;啪U]Oܧ-5F[q"OryuzP%,n0*/}7+xP `Eϣ%sf̺FvM!@[Yr>h^ \mcS ~D7j$_ۡ*/̯;k*V<}%6>|; ~U]iQBf4k_Ow*(as2f5DV+)prݧa 9&] \7ZEB,r%\nGRtaraw犢< !/nq3.۹m~MfMέ0"l鰟ƀ$T+I Q@8@D0La@w|%6CR~dAL[qz@v: ~k$ĺ[6*$ojHtO UDkqZKTv"v?լJ2+%c"1'{b4W=cD5xH6Oim&%^ 0-nP bx';'+}d/LCUN2.f5I eځ׋~o [J'dL׏9PC>\Ϩht[2E1Hښ8 %I" {ob(5F8Y)/D>繗{n 6(Y?h6߃R[^/IC᭭\̅ѓrwk nCŰՍ, <[UG3sNjHK7kO,Bϙ6fB:Swj ܜ JuWvU+Xf | (48531;myRf I׍b^54 DyaPC\< (&[ޫcE]7r})vKNi=t޵-Hszo"c ϘoǰyL9J'ϛu~Idn%յ''́)V)D@wȀbJX8F,Lhny wGCBhw&Dd#0BRI@ HX 0b1U)A;m=64YrYfjygm>TZs^- d;VM~o "dK.RsI ֆzSI@fڸ#n|ÿ5桅;I43-eB#_@bF1a"3p\ŏ")>\fe`-t악z'sc%cU*JZgS|=we :k{yqQO΄f]G_-;W14Y2ݷT(LeHeb_/5Y`%S<`+bLo=z1~|o3aB+hnc*?j +vR(@Xei]M^C{kAoƂ}1s/Ֆv!TFN+řdH&Cx-+'~w8;r )2;1Cd Ay—&e7I2s؈&(D`#ϗ4@DD14^ޔyb %PA>ke! 
P($[D6(#WYM⭬vtVlvӳoۄ]k1$@↔MƔks7kQPl4:Xr2B dBOݑp͕_/$“MUWM[MĴNA;|{>ٷN9#Gl0m/fƱTk==0xW$=s&S/8\'dc7i1_GiPÆ1OU]ҍRp\x<\VrWle֤aZZ,gcմI__qz9'/l@;vygM^֊1Uu:!3)(m_6US<\xu=`m0uIr V¬VDu`Rq}:I.[Q!$h5|ljʓp q7^ ^ۣVD`V$H; 2oOHL(lưAr]d2]4zR*gQ]w1[F$Yr^^~xhIBR 8:1R U.qjV}sȱ: 8# f67jpN|;*[6̯۩"D1 6z jt7Pz3?fA2[PDz|??u30@Ɯ͙[7hL[<`2}^%7ysdzxwJ=&tǛxLo6s?YvJunݙ`p)7x43{1er ?Avzrp~) <(P8gJܸ"˾`cއb[] 0Ɖc/`Tw/x i \*%cT_`q TRorg'8گO -{qt&>II9E4Í؀&PJ' G0<Y:az*%J/9A1s,^@^|t~kD4;2sϿ(V7_i(ȝza}D,b٢)?w,0mDw5|avGi}3ʞ)0#eߓb= pGyHK4 i4{?+ 'M@O*7`&K=aۉ &NER 56r>H Iq 3('R>{O觴j_?_fKqzϩҁ^:ƈ 0(ԣe^9 ~4 3Λ%YQF67s`) =[5OIJ8"p"#j13I2~t44l#y vcāK(1iRaHH,[-|D7k:Z+_R|IU%UɗT5}I9hz_ZkPa| YY b\"Nx1x4?hG1ƬHo DIX | /ٺ(]G/'8ҫ"m}I}h<|ܘyu' ܯl+͢HXM->]䆾OߧԴQ!R=-߿tv?_.çzYj~ono4]sRɉb"j8$C.F80-}[^8VNLV±j^8+}&AFK.j.<5GN'뒐,ڊ.Za~D\SUoC%a%?mO_bw m?q-Iq*bA6j'+&?Գ@Y£Ef'BƂbaj1bz'c쐈FDJIvVFzR}Q kV"c?D&8DS`[4eȪ r(s1,`QGY1[e V)J@W ĬS靖rXp!A9i 6E#F%"9B-LJS`K7?/n?Q5>ʸ0E8Ny#K{= o=TWQ$)Syaxq8@v* /]AR M$ XeN[]a,^fj1m%B{W4>MNޤ|iaYdh,՞1VE;VF`e@Al}}ߢv֛/?gW|'ȷgrSfw rFYn5>KX ^z9*WіϷz܄ !^~Wك|0}nAݺL'vsq5z{-hnRp-{khGǴ3-՞dxlp n18!wbݟx }R7wXr科0 đ@&!%DJ9[S*KNaibpK|=PXum[3?2iž322W`(N^!\3;W8hp ڬV$St1D!u3*Z✒F:խ"&)l"&)lҌ kVCT0HdHHq-h7i4*N5 ro2T:PQfa)1b#:/% 9qJf r)erH 3z)Sj 1$VIwP, y| *>hvaY"(墓٬:CPtD" SA@mN@4#kJ FgpDͶE1%#u%\{Iֱ>#>=RaGFx̝T{Ơ 4\CdN>MdRMV7Ng3g@> t^N 60o4"e`Q)]vL;'A JTz>-,s);{ 3YJ%T1cKX))x 59vTzV2En [!M4x*#|$qRqu.Qk`-G0E (|QFIVa/c)Ď\iEWk)kS,ʟVhFNA@A|QZ3m\0v]mPX  :,֏_YSc|9}a;-_b/' %F2[^+qW+$۹h͗ʫT:P3O{[Qd,쓛Y 4QTz*>DǷ["Ѯ`pʻQ08+ @/Ty~!=KE5x2븷wYb\L?<؛7 n~9Yxs9P_Zc:GMte,x0Hҋ; 6_}Ί|}77J$ž1ΘQmweցvl,PβZ3!v9Kl5;#l3u؞,F ]ck:?^{|yC};\n;6|=g%5BldZC㌹Y{ɧcyuo5'nؗWPiC5W|M6"ZK7pJ%ɴ BrIl?%^ Nzl }9^Z,|/#Hs_n% b4zxo -H WSxNvMӃo訦rr"Eׯ)xz{(䬤kK39DpT:/P祭<'Ļxfx-1R+}^ 8@GO@9 MB= -ρRb}N$ K' &R *qA1/oGIyf۲:Æ |B\ p ue7;@W(-Fa^Q`Bs^AwS WpD 3zŤ,=Pt%Nb\h}D۴W*_ A0jȸ6!='s<*N8]N6DPOj84Yg nvcZ'0vXb\D[M1`.'<o@M/%~_WyQP -1iɭv!Odd*lWhG;ŜluYV…FyuRseH&قs_|C3 f,ҝl8*Y&&FW0䠑hV*HyӘHMRaUwy!sqt"+Mg6hDoN DxnnxmxKJp=4r5a+Xј;X |u ,a(*R}doNAm*|*;, SB\e5-.%E|b)tݧɣf7+"sB'tVэa`C0{ V,nyk.s]wֶ_#0RR 
fXJe#:\n'Kn [H\~p}_d8&l@=G]x4|PAl2kM62Cd5,`Fٞ[!5%VݨB{}T˂o.@LcC1ӀP<`vzCfÒz=7Q}ِuQ?7{($/b:~SySS05 ڲ3QWQ\q{U#7~ۇ N`k=[z鴉 lctzH׏DA$#SQ#'3"H|BqlR1n}0$_ c7 yF][qkjgb x渱 v} 9^pE{O'zR, <越 &0x;0]FYGT1 2w}0;@`t*Dǎo דSub,ɟ|*qCYDeg49ΩeF+(v ߮)jdRx?'ps0 QqF=(˿#\Go_߽iݼiinC-{' ßcǩϹOcW۷|N0 bx$AQ-S)}|Loɗ$wDŇo./n޾кY^{sѺ]חޞևZsyv/պ={sq=*\޵^^=xֻZ,-㇫wo.9ߋ;6@4J :no r/7.E<<./$&ӗ^̐7qi?7= ynո[(ҾܽޅvBGwTz‡-o[/c$3H5qŧӞ/)ġ(Fx*^yu4֎/ <~٭=HĄx}<v`?7zB Ս]QAUГaTi4uVzg/-D9Ժ?>\\E# p] zvN+lXr!\aDc!qw(+~{n[v4El߃7Ea߅u@NhKp'|@ yn=Qڿ`2h߃#躲SJY/nd3QAME|܌zna &}]vo<'+2a$jH-HoǂFy+DmLR-Ou8,0\VzmWc!6T6 wO0QPEVWZ iA gm{aS]3S]ZlcTIW$>~7ji1"7-P ='nDqS!ʥ(SBF#!Kӛn=@HxJM6ʧ :MZKxg *W,nra:\i#2M=F0E5a23\Ŗ cUpj1K#-| 3L`w^XcoڏvWR(,xn c#`n)ȸbQWOVޛlwp#5QR:pc+h 8 N ;LCvv27#Z1(ܚ\A]Zi?9&~ mN WrcDnlS[اo{p[>3#gpj;p ``ÃX<}ϑ˺W5bY7=KwoGd hJ, ,9kHv5wBK''_9]9Xe)/ٗ/_?/)ߣfEL?5[2X *i6ߝ}۫nbNޞ]݄뫳o;`#sm>׉6na %HUo>~vZ$w6%_] KIH(Չj?`;9 %^ՔŶ JlGA:C w\xAbljj;tf2 OfMZ*kR69Ȁ0z6v0lj n~ z3fH 26Nȥ̧a3y-4o(S&a"Uڤ*mMݧ*mR6M"& F#?=c#2[A. kahnXe&]ޤ7vgi{9Ô*?gR q6hl+qtR mtrMnc%j>BJHiK.sA+; iS;| H ]xWsȇ7nfqpj.D*|v@u]H>'8(4Fs!k7U1JMqjr6+*wv,k`٢AgPvlߍP9_i%1"imZd"G{s D/btu?^v PȪ]M zI:i8cI_cY7U)@b:Vl9IgJ{۸_²/!q&ME U˒:#9qNio%FyssڑZS4kF(zԵӊF QXUEg<="ȊϘ,y}ȋM۟1CtkR5.yFhO< >"v˅ER"qP읫ˋMW'<6MD%}R:>KHN ޖ(kJc1{]G@W{DJ4H"-1ޙF=.2&)~Z <'Ȓ?Cχ@t>NGpRx.l/y,W=Xxx~vlOwOyE\KRc ngrC^n9ܔ+%Ѷ)Q 5D! UP* rCXe$Èp)2֥dtEC3vt(ZX]!æ+y2d1-\KҵsZMtP)QrWt0X qD4".Qb EXA 0JI Eax!=yn! L1#rS3܏hh}0JPuRDB(E U$#Ha )Sǟ˛9ۆDvf{(#{k_]Wu'+ыܼjz;rU~ŋWY1Ƿ3۸~ŋ"K WuNwn/q̐k֚:#_vrF?PhVѹZC0J 1goFFR N*h&߀8N}遝D59;{nG8;SSMKj1Hk г ׯ?\7`+}ٴ§N/!ƞRm V t2d_LK'X>o[[Ch޺GJVrౕdl:#]લӸJCpɄGN>'+;`5DlK\!*D;Li*Kք SYS#T *@I *lpe*m>5)1@M~!&w k;W+B]<ԼZd9"(J`,2:Q5oW ,#@J1&rJ, /W &X1nl)#=2E QQ>K4!(@(*"瘕V;l`JVDF.\^kɋ x ʿ`-:Kq[o_Ϳ)bT֗&bvO_&IW=؉kc)pa5C_Ť rUȋI8JnS2xȃMyC)lx5YZ-JBT0]jZ m˯@VnY֗+p ߧ? ʵ$m^ Χ<,%mF<{ff;2Q>&NkF{!([)gV6>$R:iWg;sp,O'폞f+}{[asroJ0j Cͅ?`V$C‚nmHg~i/7IFV^FgP[*(^fl8lC> vCH-Јtw5>N"X%N!-^k<]iuMתYR~c׃j |elL^!.Jc6p%j\O7~ē$g)풚|ve=~+M k. 
E|o8/z_^I14nM%8uY_vہueڿ1k{u>_'ߛW^t͇޽x{&ox:_~wcϮO?v_~_y[ū__~,2.iO?v9-o^нw>O|ud4U@5zZ-{QXSn_N}ط?u?z5Z=^탎׶5~=g9(ozld~XQu]TQzƲn׫`^wԵU6>hHW˛/+n|H/?ѬmW?&#׉UO`/11*=I^ w; ?v~dngM5L>oִ"ڵn̗Ov/Un]0O}{ߐ٘K23og2?;3A%Z0H cf(ÅӺO^aB ɓ+jl0g9P0p@:'-sHá6qG'ŧfJ ;rOyDBJp?54|['^ Z$@v[cC՘ͻ>}TȽD4OBۏtJ\hWO)fiG #4Bq߄:RHQ(e "F R!QԔK! p(4bCc1 ׈JC!r{snEިeu-^ꝀT#r rikvRZ#Xnw7" ""[~}2G㇙bZnhs0R m}QPA#R ~Xg_Pd7%c@jE;Jj! Žt $B7ؙܠ#JBb J&C f0cЄsb(E*.4~uɼ6ΖAƪ,. 8its4]9QZgcts:|., "4LB +;Ua?̓)0ʤEƄPZF@ 2)1E~ l :"CBF;!BZh0S@l"!I`O9"JwɄu_SIgnC%FS2 Zd̲HUrpR7khP]'O{/,Lߊ.qAs^[tjܑɑ6Ho:8!{bƫOioσMN^+9z"\Qt[r=pR Lh+~4TcxYǰg"G4! -of}QN;@)4h{hZ09RgU^PmP/du a}oq_R&xEϝyƙ-7uR % 1{cIqK صL/+Q D/(fiMn_Sy`4S1bӒj3rnSu\e;M姩|i bD0R-sk B:VQV"R1nU/$a;&pxxx<>>^=3J89SlY c|W&pKTtu%m v:QDP $@Ss#o" eQ^ZJfgIջO6bQ-CO\IԬY=q6T  W:¤jҮPFj]rVrOJV[e%uhKGp.7Ayg#{GA& 1I62zYe&XF e)Ĺ"3MXW<@ёQEtCHtmx A k R)ҺzZ F35iyO3&e$&DWmGFVnڇ2ețD)TX@&R)JƂwtb@im2.U6rmZs;bDDG3bιR:'<:4ACNQ8YtQ D; 7R;|+HVh sv&BN@-Y7|vF/2߅^ 4#}y)JR S[ 6 ()Fuv@vJQTQ"LE!(>nL"i LphcCRI prǴƒ*1}L{Ue}oSaNfjʁD`2ŭk-/0 Myt QHѾU]e**i` NI?FؕmfչLض}B[G\LN0ce|qV+bpJhbc< `̀R8Ah scjәjS 帥( D1q22%m5* =^8\>PT 9qX֘'0fNF;3E[mtha8e(ƣIFcu†t$p|PM;'9ٞ&h<2TZK(󩃜D]֘ 1K[ҧps|ɲ3ii^I4C@M`L` 1>Pe &2"y6sOmB&)GꊁО}œ3k!}ږmٗ7vv)1"/n|mv9A0}H/KΖްVA;׷IT1nO]) #(0d B/JɆ+ C3yy%*tMS^~>԰Z(#35c#?Jնs iyJhCKHJIh-!U!V̢D![([&R<) Ar~#$Lt+,qO=T`#ƴX5žܩ%] ^?B+'V(io`uUJY,R~:T۴r [ݎ6v@r}"K m5NY?se!jvыp XgY hyR0;}tŽ6vVCtG\|jc-mbCM9Na]u0h?NL6u4@v sƉm E'ДHu{gڄ^iK{O'mGF!)8v̈́ (=@~څZ޷?&e*t3*< , Ѣ!?x+_:=>I5mn ! 
r}iH߻my K/PzGG/^EwS[Z;, :~MC屓VKzݭN[eP#Sla7\|oۇĭSg^Y˛Oڈpu{wDy)0cDv/jI^m8 l O]㺾Q"?<>>bRxܫ]n7^_vOՏ8-#` ΞRF<}>:^VCġ^#c֜j}TM||n !ĭcF.s},=:#xTz҂OI;Xv4ߦ t:=zpROtN*N'OVO}_e謫|lSzø5brLnCն4"T?&sjc4Jni{C {4'H>?czK}j&Յ\}X]xݮN/¯ ۇC(Fa}AQ`QwlptBgO}]Ag/IZ1j> EbIFޠ: Zh7QKEC:-Q98G :i?&p5 _._njiG] W.s=zpnGo(᪼VZ75s?S XIr*E]msk*Rx;ZW>Al)}-;,'Ϋۯ)CT&!*aHUp5yfr ۦ1|+LS_ʞLNV˙^@VjUD nǥmGjnSҹ_ò,\a:grgΚVPͶTTоR5GHz$BW_xIHS9J*qI=>ltN&D!vCk/ھ tfƄçg945ߡ ӆZ߳R+$> ( _MZUu޵5m$˞.ϴp*R'[qI%)\D).I DH@ʉc 4o;;MX;3ulKKrm43K__GfNoY#&{2<0s$ ƒAbkd!CR1c6u?奆̎RgI*Eך<=D ?NvEʚ2}dk!FuUB1l,Hg"쏈ێ~+6ad/*MHU1̼*W.Buj!FIsXi'yD}>U ԶKԬj `;0=ɻ$dg<!',T3e*KnY| [ߎ'AWw(*^c@1>ne]"ni;CBΫ#B{uQ dޮ 2f#o@NI5 V| =a*z!eI۾#c>OwV?vgWPL5^Jq|Lo~*[c : A"B\,dC}[@[[`*OvXcUn%0[suVZ u.%ǃ:jZexUEF`֢ 5=G p@@PugLnUQڎ &.Ew* ڹWAs ;<"_" Ĵ!쳜Uޮ%ͪlomVE\4Uu? 귋b+ MGGxw>4B7 |(]*l/;%K٭̝/ NDAX nNDrO&UcC6f|l]ݵ+0+Xڹxv\T~.o:>¾vG5̜ ލa'f~bj:.ô.9pЬvZ6hFOn}3irWukuBP"eb&U$Pw)^oԡY 3 FNr>]槿Cbd^vH9Ny1t&W~E=}eGK6g=Eʦ@/\vf!/oIA李n}lےuD!ڈ|vv_%3J7* B 0B'6Y❗A@ɫ9ׁ\)5oѡ_G~_] :p݂+xEHf T4ߩ:q|vmZ,P5*)1q~E19aO\S! q| \t.P⇜*BYg@,5@DЮ dDsdxFS(" :sF6 k-*a좽fG{noG=@ҫ3Nj+Ԣd.CNeb^]ƚ qlϟYm/MPڻ)HXmXJҠ|R2mrTm!IC8Ȁ'&7gR0/t:/m[؀0)f֋a4G(Ӈ+q)*znPrk|䅛qcj]և8ZpoъZ*K.׻D-{ɷ٪ebO>ߞNgϬA}3 ?[|z%úK\ީ\'&"`4ˢz{}ZѽsF=S(7k7Cs$2 zݍ2Cqvd~Dt#xM7Ş}#wPω0ۮGtb9l2 ˚e_a6&=nAs#6QF? 
6\XgEz.!SZTqy7]u@V1iu!w, gGz.!S Q׍㎞wqża#~*f\1 i?z< eEݗv}sdGmnNM^ u⥽ '#'JMr l0ٟmH3?oEr9yn'Ov?.wx7-9c𞖛?{R#܍*e 414%]d$#9,#yj-ko/^[l,BEBJ[_KLo"Yxxo!Pkؤ41" bHF\?Wh󤀒R}8g,O3:C.G AZK$q"-LǑPRg.,b+bGv)!"xy]M{HXB|Vj}qwp{Γ{'_.noWggxVN7懻sQע'_t^z(EC)8 "ġZ!|q4;<<_JRj{:nƥTS}%FRPZJN(=:=ϕ{>_Z:ݐ7JP e\ģ@)@Ja).=nC)b1l=RJ-lu'#J}R%ݓ@.W&cN7njRU>N#{#FVPz(Ց(3rkK"Xbӷ֡sAy\.qP|MR /^?}"ϳ2y,U%/4{thL OWꥈwN'O>|{$?__ĺYle}(xjc[)(͇$G3S2I5xy/i/תUPxV+|bx$T5:mk~[Vy`O) E5_w7W0K#*G C0LSÅqpb6ڬJ멘חD7j84he = 3jK*ELq|qJ)ׅFFC֐okxkV=i4'[ZܭUNz-K(0/li"o' Ե][.ӺeZ? aPJX6&rfR5h Kt="2]Q 7}\=3%E&rhT,I.s28H@֎'4$UF,.1pyFX`Eic%>KK9@r$2tLHBCY`f| V]p4h`zZ-0dx׋o\qnYu؟5xגWEr+ˇoo/Cyg+bH1]MPRsBh+XZ2w`UJ`rP2y_䪩ӡ3]VwQOE 3%0¯`r$,CMڲ"W̬4T I$4yNHK;ʰ&?/O^8iޑb$iJ=q&ďۧRxB!9?ܹG 9"W@ Vc?goμ/Xǒw+ײߞ\Ud:\&O=z}j&Qf6ϛ@J u}: _>wK.0FldGiyR2򎽔>UwT w0/DE꽥 K3`Ej`g~%'[nܸWXzq&z_T5f2SpRaiHP"eI*$ )6䇱Dg|z0U=rzG4f ci4-L?_&6%ܢHJ cOB~0#U娖DKJ6Pj)#:5DL;lE3#1C׳_Cf'' 7"0Bu%%gfu V{mxp7)|K|Cg:|܌gehЖE5]zYlB@{"j$֧u7 `Lp/'r[vgNlD+UK=acmoOnR:q<'Z)`+mΞ/`-=)R+kTB"`CN7y Q@n/Ogf6r5Tn'=Bvڍ_\éhݰ05im{1kv$!/\D}dJ; S< K]v;4GAs6Wal/PbymV8rPfu餔8z)Hs}R6ꅔ^0ţRQ/% ^R&4?;9R5VXw mP/Spc &$(Q-iLqEpɋx,΂epَǩ?&+y}>@ i@A /@P"VS(#D)e GH2F*%2K(be5%X:_9`,8Xڄ7b#i&CC蘭'rE1t 麀4,:w .oT6=^lhJU5ͭdZ4%V,ѵA:ȋ@LKX$fZv)wV^z֍9p747BY uR֔)QM ^G޶8\Yyfp.?ݱ )otϟ褚!G5Fo k Ϫ}>Ũ*Jqih>m,' 0> \;fVIyvYx9J)ch9αTR]NNpLe(/so` X'Q)/d~/^KN?e%?m:G!Bt~œ%^J0º 8dD9A?p+cN~V8Mb-uD2 <=_w)?6\'?|U4@s,'[6Ƌ)2Kd ?[|rr+H fZV9Em[+]C= RCm}d?|m2ǪgFB6*:Nb|;Xy1W]ÎĤR(Ep4 7(`L j08SV%yg2Ԍ)Bǔ]Xsp(fpOG'6X3 cб݄^fagZāW՜񪩼8:"1Npy?\)_-|򯚩)uhw̅Wt50 ; Ry 9upVI"SXK҄h-!d(1q&S_WL `"`q-8쉮>HEv9t"ݱ71ABQ$ w:l%D1Lv5(iDp &DE18APMUq$;TtlXf}T(bHMhK3$:*ҔdB)ThICEo@j#55Z#27NѺ"VoLL0W9eSU4Uja&@[S 7))Kx+eƇ`z9i׈˕ ~*IC 8[M˧'0[ǿ~xx"`=\{ϡsQe ;\n "Dj3P(tz{;`ۓ7X!0Mr]xcliwާm¡dؖ?]}>**`gk8EM/}  DmuC\;ʞ)X l2w^3pԶ<`׈AqjXTn a%9_[P\3r/b{O6X YEdy( w aBFz-|mhWsX!8P*KE%LI9N9e0Ww' Uk,r!P?^'..*׌|'W|I ||~u3ۓ NFͅ=c^do"4\Q)Li3QIN&=N=dJX)=^ 4ɷ@>". 
bdXL$J++WUI.ߐ,ngV D& 2:XA bX48U )`->?<@4̭ʄBr,ͶQ+ߔ8W7>56^x $tH E:_/Mh$$q-¶͟&mAEr 欋FRaҁ|}c(P&0ѝP^F@"F*w h"$T %4p)o:&A֭hr L`AĆ-k#s|7FtC{jFZ!m5:$nY͓x}LiC#i!X;AIN5/o&n(TeVui,QFT=yAӃsq? zhQuev?V~X`I%CtJoϯ0|8TtKx`7H(9Y1P1B3JKg[u}3J&? jHjsQD\c hn#W௑\Je#{WZauj0jp5[?\˩C3Z&j]CM4"CQ? B]T΢tOWfyJʙMxs'EWjz3l]޾85~OfiZt5WozvQ|n6@Ʀ}y:3#llp$W-uv26N~8߹oғ=f7]UE# y"%S\n1h7_Ml/)HbͿQu!!/\DdJ[MUNݼh(:^k4n'El,nܵpɔc .B`*\ +No&ҳOfMN͒lQٛ|zK[O\{ yz:PBMiUp 8E>η>7{< eT*6tAz]m[nn(aINZpLLAb-ѰeLLY) rba~ki[@m3gO>K$D9G&oZ^$ nX36yBb6[uvma11GhY =ο7&9Y&J:KX:-< XIhfdX!yv|LYo\5aKpL>[&yB.ڷe!:P' ȑZ-G^wGS^h(W4ぢj&N!0Qc m6\[M;]~N9t.cPS2cXFeL'$%JD8Z'  AT9! s = RTd*f(*&120eQfxL!"q$" ZȯR*.9bAuH3%A*KOCQFQHN1 J!osozNl`BK7"w<4laQ(i,::*ŕskbtM5ʝ.Ҷ@̺|0E|Rh{kQ8r{)O#T#G a!q[ Ry:~5"B1 h~_lHbZ8`$ͨ 8pv `hW1c]hv .j|3[g,wGI~eH{7WXUWJP.pG. ]u>N8/B1Ѫ#w]u#cJ"Q""E&MiX> .@4a2&TKmXDJpVRFB*C+ !@(cITḌn1҂G17iQiBmA3nPư !$ú vY@BB0@OYiB*ז{ փ_Ybe*l&knQ<5yEr 2+mEy`[ d1W[~(R/Hif.Ć!/K)= O S̋)6FkaVmYn(LЌ"HP Q@9Bc%88n.]rA;ojbq(tWW!bW"=Ј Qd)W4K맱w իV9t5Wl$6;\tvaO}h][o;+F^"Y O٧bg4xȒ#v-ʎӶ%nJn_XeS.wmp-j)rTg|z=wO쨍{F?s뜑,WܨR ko7Ƌ/kF+GI͉!dblLxJcR;Ϝ7鸥->L~QzjRjǎ9$uh͕9PzS1)_^u蝳K@H~"/:?sy7+n_K߻3p ie2yZ<2[g1(;]N`D#l j -WippCJ'x .QW@>N~'w>ubow*_tYض|l ;=kXϽuӠCV:)ǒ̪`EEr06]Nҏ(O'E+*Sri$=vrǴ=jhaU4UW:(}v{P?T;wWFÕ4iS9@]K6=|r^e*ᶇ+xn.}಻}#(ylN{l$<(DW=CQ9r؈i5^~^:dWq3xuxEk xoGME3uO"W'k{{:obDlqo}9ݹ&_]~X0>"'ExJғ9n{*ZyY*7op_^k/CmGsyo;7NAK2cЂk z >"iǖZF =4 eCH&j[ [\Pd0.B|!}y%eNIgDm,3ikls!k#.r%B:@P{m 4%rUOy( `牀T!s@2,1>)-މǃgю n?'Bmxt0`=>ȸF;7'ѪI/8U9FGB 0b&gLI[CiT/,8 \lg5v39ګ\Y9 "rH&wSzTQ8@%+&x\^Q )9tQQ^i"E򢘴b~˂̕*er7x(. 
F`qmOnАs6c1/mnVh ʙ!K^GN{]$xޓohwnk~}4(8WzK>QhhS 7*y|o桴![;}jhʰ{q*!I2'dRb.DوXNnr?zIctcܯ/.I<[BX>4ċ䎩5ũO*/) C!*$K86k-vd_W$}(ª|62ȡ4YurݴsE]i9K!Ip0hY-NzL;*8]T)P"Ϝ&2Ș9{(^YdI泤"&}@}Ȉ(:ջ7W<~ĸP]?*/^ {$]֏4jCןߦz+OQAHyw|Y?mV)̛_"..߾ 4EN}CeZq俯f3E(W蓋JS@:8|s2m{>|iԩߜ,?W9Um"W[櫍 ԜM{?'#I)+M0)(ȡ<_]hktɰ=-I3 {ZO iy۱q-Gp$j?nD-<8ic!FN3xL^2v:5@yʕc]ŵh6:HGkN,NtafNf_Ai)$5~uhT{@{&=fIM"yϐ- ^X&  :jIi\dFڑh`5Bl;4k>mLU4]Et T J"lPA-]̦p\G!_ YbLZNƶ7ѓΛ[挑T Pk\,"r`MکڑI+= Yp iuPΐBfɣ`5*Dõ8dzV6iB$hf>=T8;BF ťb&ýQ:"onV٤ im`hc2Jw~"o ;C/ 0ڌ€NF @%wȑ2YHZ3bTL:R|!!25¤ă)c@{!҄h$J56әo+WL56W* ݯD#f:\9:P(رir`.qZOsWΔ DB e!.ӤCi)#!,歚aYS%CFdZ2T s[:&F8FPV3qo.o{ }XϦ1dȚ*7o(dj(z'M%T6>َW7uUQ-Uahgz>MHFMeS-3Hu@@#ލN؃ -ڎXy wcn.i#߾|qmo`Bت'4 PE(fd8>pHLEpL$f$?QEBN+_wmmz;4?vvͱ }^$ًzHCD;NLUU}zORSuB,9 jd3"I!k%8YsBiJNAe#kL4DӁƽIsm^OV{h" Uz.Z2ýX S&S! }hJ.>w>0nu)v޽O! Z[I'#D(&.Oltƒ%3 bZ+r y9:E4od>6ﱂ*vؙ0;ԧ4<a8\gRjt )xVxO՗~/;փL޸")9u¤ŤU Ǣ؀X{7֙`mr LqAYrQ+*fUv.'<|Jң-sOeobDߠO]7&XP|Cץ.j X3zQ`H~2c Rʾ߭%50[Kj~Nftױk"fV9QܶO^I\ch{ިΚ$vnt;^%$!}Zӄz_nā#.D&:81NLlqHHgG֞Y*dY=Siw.:k0tFY `vn-(R'D>M? *+Ê"?>; 9mg v(邌ЯBmd( dveX:<LIPsZݶKxvь=Սk֩P/ݚ֪ڰ6)-N>|X+z2c:Mqw{,t[cu[7 YREЮBPid6hǝd|U ;yMm=w|bb;/ &]WƹJQQy.տw qZZv>kȕQS^L`U֎Hud?{71^?v|Rx>{sQ僮焪.ɅVF/nmwf~Y4a=h$4~(!7 4ЕAC ~)ٽLn>լLSYI qzN!~˭w^68g /9{ l6a ʺ2,`D ItTclq|.ϪIILe!M "Rbࢅ Ly)#!^e݀SQ k˩RSŌ+jLʕ2kYLT T1xoA LckK|迼5pH^pr-YcIhJ)Ŋ*u4VrO.\.Ke(?t3o dB9)d/zzŋS$ Or< 5<~/sv<rOؾ1J $KE~NQhۂoh}ic*8 ߧ8=*b(g% LA8KVr?|UL: t{tޭWAy ])ǔKZwh#3g' H~ѺZ ṶAP{h8ƠےDflxm$ϕ9~B EOFX % 2GLT[ub!~A9$oeAX03ϥCZ#^ :ŠO7ޘ @֓H3Y8YDpXoKAMoS\ܑCd* K2f7xzO!DŽH-J͎6ڭ*r\"߸KQ@3%RXLH0Gm |2LEz tVV/ GY<(8,2 1˅iꑢ #35ћ.ң "o%Hf2M ʐL p[>ੁ4/N^qw{oע ^LCy8N Ψ`=~TPL_` cp#ak8>7mJn'&w8\0Ӝp!̧0AW XPA+x+#y]Di< #&%݋z)¨B}]QNirFww. 
Wz>6ynOR9Cx֜٤F]<9TȆoxG\&ݚ:W,7>f+!9 E3>FCDP"@t1 q&({뉆l3x$08>!Mh:NiA`ܷplj>NfZQ}y'Rt BgB%0!3bB8O}X4c:X$%Sµ2S&!o \4dk$;ؐD A/Wu*aU}x^ "x4F?3 õa17BCOřy/im>(s_ TV^[%+"&JᘽRZxeԲYRث6!JT M>UߢRJ'FCBq2,=[du1X`̓Kw\X{;LAL\l-Y+d ۇg Tg 5l$2J:]e*E,<,b& /MqqGW)k)j3u"_N] Y-UadPSj!zZ~:ZͩW}PŽo.n4>n:R{vGYaWdޅDx{o~yzvԓ"-ocC#0yE'_d'ʼvkwFvjBH\XCL`$%+a^DI3Aq!ծE 1G`Se Jud]ci 6_ - #$'c pF&bZ$ht"HEWM+_sI`y@C$*%e9˵:1*RY47.yQ {5nv8_S{]32~1R82 ײ,Tjcс+9!\Ō\&b4 5V2Hl*B޺C߻rgAa_ro24B>|gl>;P{GNbUX ߜQL%U=2Ζ 0/wmMD sugo0讔lNNgۨ/3J1l~wARh)!+vH䕏0Y09׿R[oj`F=ힺ ˿yRb:G4. "N^Lqmg[<`Gx^G--AE ,*}f[,UiT [糷=Z̢\ǒ -$!GCQlL9iGl++HD?jRh &G\`l`QG&܊gz_rNΦeފ!gX, 2 6;mE?d$dYf-m+ vm]_UXŮ 7t(2lzb]K*`( 5SʇŹƗNLDHUwʘ<\y;f%;]M*d^й"ynr ǀrW0miͺ͘eq F܇H+wY Gp~nSNF^,rӌxE"~ĕB_ϵ =#K-P9xf'=?đ'k-Bf -B H6tKemw#2H}뫎fOĽaiB&xglK.;cm)vL =V"go߭znK9vj3?tJfܩ*]Sg/O?^u^᭿2EZ ܵP%Kq=$J䑎AY^w)'IpvVvB׋0;?zl0 !ħO 2!{qvp29P·^)7c9 SxouRACA> yXlT,ڒlكU;Xm@X_tog LmɉS WwtRM14xO7Wى|/m{wh4=1Sy0{WjiIȒZP6(گroܵ8L|/i}?tϡҞY{/}Ӛ(h-i W"Dt~ox[*bT'MἯ p][HֆpM)Nh=MaʃI}Fv(*?-+Һ!/\EWtJFr?lPOJ;ǃ:ZvԻ˟^xw05Cz]t42 VWX+ɗ<|Lՠ6r.œa$a r5yJWosĸ"Ѝa$C;+x]IÉq#3+hiNY&#J S3·#& .3kⲝًkSGT\JɟrEX;h]@얖R܄ͅ섥X>?BxD{dʼç(#n<*=BY$PS:+*F0CRtZ0!udcl`"}Lت,2g,w硜@d 1p \ =]T܉8uaa di2 &*NLKG20Bp8F>Y}f'c7#,h0iDFr҆µmb a2]QlW8'T/W$qT&Bd*l&]s3(?Q }༥b}%{H@g|pX14}1.RZY ] r!!is)(RK Κia&| W&fTS. G4*2B(,x! 
EH53QG7:jD_)GH1o %Eźdο(PP F+)^x) ]zKxgeh3z)h /H}<&)}chon/khH2T͇LdPZcR(Etwe?vW+-}n&h^.K]_ݒ.x1h92xe>HrG|S!V n:c\`O9bDQfb3erʤe#8,M~PΈ0B`=2'uriHF>"H h/FL4`ulzzzO$+ Gg`d ZppuRs\\y| 2o!5pU9>%E&-Cq;g/)b$C\u@',B K+ /f*}U-o#$w/;O#?7~ \LeC{h^p`ऎ,KdB vmU1/xRa}^XF^Ktt>HOi p]_R@w {rrMW" <I9.ɹYu#Z"wd:9t+zbj|zN-q;vvtr>^b"_Ii5(j(.75ekk.D, d|npEadbf*e~51YW:<`҉0~   yҙi,iJW~}ncHb6^Jc)NG9rWl XjJ+ Y;o\`i8sL锥"QxVhe`?5M#>UO4ogir[p6@5SL, I3(T X+T3G5C}FkUR֓^[XRD'tgTG-}ZJ+cKX X﯒_޾}b:gxnv>9ԄABf32Q6~6]*x*j-m~80Z6?SʫRX\Se%B.difydr{p2(_RZ&[$gkDEgDhm&LeB8*S$AA UC06Pkb-Dj{\mB`.ߌץBʝPJЇ߸6`3EX$* uNQNnz50z ϩ(ބژ'ut;oW=2oPߕE99=W(J7e(nk]p,I.޾Cz _e0jGJÎ/7yL*MRYcZŧH.3$d,LWFa,9 x^j1O)u;aP;j J$3^$9۟$gNq J$rg.9*]/9,JdK2"JFGK=Edk-`bRqM4decX;?8Zߌ(ZvsܘZ CiY6 iW|2K8#$<ĈINξLk0?̲yaWBbsMk32w9#qD*Gl kRfSR3-@(BL2P}8GK+~9BcVXYȂPCq[VnJq;k}^B^ N+Ax->?#go矟.ʬތmW!uj 3D!>.ƓLLoB'SPwz2H͌^_h<B۽R@r8TZw7]qMV^Z$RSrr"[Zߘ \˩o 4BZط^PD8 Una*n'c֕APY^6pdE+nN¼._5kIZF]PKoF@H7Ҧn{] z6er-ˏߤՕ-+`lzb\OFmIT'ynkjN)l'Hk\3<Pzǻr&w"#dtIjQLfTc C&ۃ(@1-w>4BX f$2cZ,7[b ͝ca%"ZuP}J5:'^ebZU0Ik)̾* -%\>q-Zp)8@̭v9'idC1wpWb))҄[8C\P jazHkAڪVzk T @:>GucQKW(j]UJ{M.d?rhҳUJ1uk+B,yC 1<-10)+?8 qt*'X^%Vz"< iC!Ȥ28h {},{w‘ [hBH*G8)Yu:5d*ΣGpKE鎫_n,"!HxN8ꠁ<:}%f*70EQa]8X|dr[;_AfM^I' nkߎ}:AAAA]{g70㋛`*/GˈpLoX][o9+F^3e^%@?fbC?5/rE=,9IY,T2F*΍#OMp>죊ByΐIrgn֦t~%kE<\FZ"K2E87jjmgȚIRs*M Q/%:V n[`Q5H,gg!srZ+υ,*bÕ gA2]ӒiuX4{| $BKƔ?Z %??{J:2E{mR2V.IE$LJPZNjA2AlљH1!Z`IlBk 4UHZu[t\@أUF5v)z{ۓ%g\ 'L{,/VzEW?z1`0P!^0)XbbE߮|D%|{ tRKC{K[j/\#֙qp}"lNn=9f^sי]2}1;.1SJuW[o=z?Kvܹ$r[CV' SM$=ĘL}PJѿYr_+`Δ(fOiKKaSOnţ= -,ʿ`tE4 7Sso8 ($b(>U1s8:@2@Jn ؽ: nw4EC|@ ٙ Ц[7.6[{AfW@zr{LUb^thkR7+ʽcj~YHZ*0:<;{K5{EF<;T=@nXX. txϟ| THiI)VSY.}-ZNcih.3\-G-"AJ4@)YXTZ1Y=:47<}ŬMpu\ࠖI)MOw-O=w7nc/i=褨YLG{~moCI$ۣH!rqw "^s=۔bԍ#l1R 1<,nzbr\Yned|uys9Gd^fLp]ϳu "a}]\]}JCqJ2q+ܮ(4.4 %uvTiDK##Ί^:¨h~Pj^gC&qFE(p3 PS/P ti!?g2-=v ߻W}Zތ;)?)3r^r^r^r5. ςgFHIHhBL"rց&k ƠA@VU[۳ef<=0Eo~mɜA.J/o#3܆z2~\*cTXӆuPce !6zQ)CDgS\` 1&g5"|RhAjnD&1?!H D1"8l(SQ7JH4;x]R":g3̈sA!B; 8J_#ugU! 
m֟=ד1 \/*IA͡To֍`oe|oA HKu%"-bށ  Ã!Tk3:\tK/3w[oZB;`u# ^ 8"uNj7o:`A w91v&<.\ZluٱF1(9mlj%Ӊj4\ݙ64~}]?d YW $_CS!$.۴auf]Zآcp>+4%O u5:jIx4 dhfEbZQ m 5*!ri \d͖V!1pO𓳢\5!U)?5n5i !=h PVfBRkt=#R˭۷oCڽWljU6UÓ{/g`'.ZJzֿz.ާTWOO\'cJwFJ˽{3{- *KpQٹ[Xi FnJ[ _jJaSSB=1MQ'zŕ#{&@Wc{I]wK.0;oN쟍 Bs2Pp\Pgِd6y־]'xw~c\ǦB) '8hlFũ)JoӂBxT{p%nGY`\Ө9TʍRQ+֗TEpmR"C-05t2( ^"0vR[O+UrVQ,ɿr<9=k $}l7) Q R%L>YlCã51L#1HßPWPϸ24M 3Q=hӽ'vM iQfFv|YFBsLdM ;u>Sk@f#9]{6Ӫ>:汰[$} fr⦄^KI*mֱ{Sӻ)a䶻5*Bޔ02+ᢔqQ{-Y08B`#WZ ,J5#l(CeXJHG Qm;6^aN+a/t{iX {RMA{RVT*@qg`n޷>MZș><${߲KyK2[wĄ`39F\De.W9T0 Rkƣ>lv!^2Et(Q+Aq_î.%a4X}燹ts + m>1` mJ5盇<qO1o8j//w!`f`*&uӐtl?|#wPfl#2qtX,:gYX,n`h?H}a>\f0ܗ%( !TB [V Msk%?hh6L(df*.6l|K:-Qs_9Z -icV!Qjɝ􂡒6}>ߙEy(&\im1y߱G b\Of MpƏspe1CjJ LxMFJs<7/r+3rH\r}u# TeշCc A%$!P0CoLRXy`pҗ~j9/=|i+ǎȇQNCSTBI4?Ukmɟۏ+Cpi\C1Ef%7~J2W0=87PXGZS#JRJ+Gw<2cyjhGKzsd,:͹{bх`X4{qx̆GŜ"0I~!g6&t%wzƭrdnZ)-.=mEQ{gzȑ_t;x g$ؙ ^mete'YŖ,mn[!,7u#YK(;F6:D>edZŗoAh.̝$q ("=x8X2ݶJke'J3AM1PO,WBiNGE"QL\Nq[P*DY\99Hʁ<"!oaUT AZ֦ɃKÔTɫ OԆ1w(W%#)HwQ5&D 9%-`]a6;UstE Rr+kb{!A:n=$BD=*6,vrCNk_2N[<"mgQs$2pGņK,ϬZXD^ob5ZR帟3ʯc.jj^f )RH퓋'/B䃦$bTYngP u~;[Ί*ʹ@<ӽ\{Ud׷=| h] }svQ _)JŢRr?އ$V'+[;]wk*w b?=YƓj{rqz\D;*/>uAPX܀x+,W=!:AftDU7Y序WaiU 79LdFnw`䊍˃8'{wXZQiW;'5ýkc4E:VqM_ ]wS7Eu+ؽ ׫`%_.;\_,֞;|êHj< fBIIX;SZX=cYG4fQ?jAq D0OG% mL?qE(I[+V5i`Ip{#4#}ߣM&Gt$@2WmI7dMن^%wp'?V=< l-jUĭ.DAmٹ}qtCn9EI1yyg<Ԍ`Ŏ-so[ !F/E/ͧ:I녀|<1BoZ3~(&2g7j`ϓWwn6+m߱y*wqQ}?фZSonFVotIoRnx VV?Aq-`ۧE.`DzU# ELծwOQJ1h4h":m߱vrSMAPu!!߸/S"~<Ι: ի^wzgyҊ yAPv/h*}G4OM oQa0qحBU~d&qV+4nծW(lΖt>>^N)^CbWjI4a~ K 'țBD2L8m$s^HWڀSQ *88Q"yqyE $t|u9K 3 jkь`Fh2̖d0S͓ '/Q M1s*[S+lW:/ny`|+n׮I_WmDBxυ*!"9/q] ESCf2Y?XnKθ<=z ^nw貔 ^Rk#K$h4 F˃Lіb7ˆ Qh nca"6jM+jOr{Q)/FB$+YP҈N5BA7,3 d?g ?.p\ 0k^` C  C$ fBWJ+N>^Dy3ac,J:Pyt! 
Feb 17 13:25:24 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 13:25:24 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]:
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to
system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]:
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17
13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to
system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]:
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc
restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin
to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 13:25:25 crc restorecon[4698]:
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized
by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]:
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]:
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to
system_u:object_r:container_file_t:s0:c268,c620 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 
crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 
13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 13:25:25 crc restorecon[4698]:
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 
crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:25 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.331832 4804 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341661 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341717 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341728 4804 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341739 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341749 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341758 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341767 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341775 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341783 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341791 4804 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341800 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341810 4804 feature_gate.go:330] unrecognized feature gate: Example
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341819 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341827 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341835 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341843 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341851 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341859 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341868 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341878 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341888 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341897 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341907 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341922 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341934 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341944 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341954 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341962 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341971 4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341979 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341990 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341999 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342009 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342018 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342030 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342041 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342054 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342064 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342080 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342091 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342100 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342109 4804 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342117 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342126 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342138 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342148 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342159 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342167 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342175 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342185 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342193 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342231 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342240 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342249 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342258 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342272 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342282 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342291 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342299 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342307 4804 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342314 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342322 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342330 4804 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342338 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342349 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342357 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342365 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342373 4804 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342381 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342389 4804 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342398 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342572 4804 flags.go:64] FLAG: --address="0.0.0.0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342595 4804 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342612 4804 flags.go:64] FLAG: --anonymous-auth="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342625 4804 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342639 4804 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342649 4804 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342663 4804 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342677 4804 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342687 4804 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342696 4804 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342707 4804 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342721 4804 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342731 4804 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342740 4804 flags.go:64] FLAG: --cgroup-root=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342749 4804 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342757 4804 flags.go:64] FLAG: --client-ca-file=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342767 4804 flags.go:64] FLAG: --cloud-config=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342776 4804 flags.go:64] FLAG: --cloud-provider=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342784 4804 flags.go:64] FLAG: --cluster-dns="[]"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342796 4804 flags.go:64] FLAG: --cluster-domain=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342805 4804 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342814 4804 flags.go:64] FLAG: --config-dir=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342823 4804 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342833 4804 flags.go:64] FLAG: --container-log-max-files="5"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342844 4804 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342854 4804 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342862 4804 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342872 4804 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342881 4804 flags.go:64] FLAG: --contention-profiling="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342891 4804 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342899 4804 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342909 4804 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342918 4804 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342930 4804 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342939 4804 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342948 4804 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342957 4804 flags.go:64] FLAG: --enable-load-reader="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342967 4804 flags.go:64] FLAG: --enable-server="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342977 4804 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342988 4804 flags.go:64] FLAG: --event-burst="100"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342998 4804 flags.go:64] FLAG: --event-qps="50"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343007 4804 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343016 4804 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343025 4804 flags.go:64] FLAG: --eviction-hard=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343037 4804 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343045 4804 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343055 4804 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343068 4804 flags.go:64] FLAG: --eviction-soft=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343080 4804 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343092 4804 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343103 4804 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343114 4804 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343125 4804 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343134 4804 flags.go:64] FLAG: --fail-swap-on="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343143 4804 flags.go:64] FLAG: --feature-gates=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343155 4804 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343165 4804 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343174 4804 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343184 4804 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343193 4804 flags.go:64] FLAG: --healthz-port="10248"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343235 4804 flags.go:64] FLAG: --help="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343244 4804 flags.go:64] FLAG: --hostname-override=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343253 4804 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343262 4804 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343272 4804 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343281 4804 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343321 4804 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343331 4804 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343339 4804 flags.go:64] FLAG: --image-service-endpoint=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343349 4804 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343358 4804 flags.go:64] FLAG: --kube-api-burst="100"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343368 4804 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343378 4804 flags.go:64] FLAG: --kube-api-qps="50"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343387 4804 flags.go:64] FLAG: --kube-reserved=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343396 4804 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343405 4804 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343414 4804 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343424 4804 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343432 4804 flags.go:64] FLAG: --lock-file=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343441 4804 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343450 4804 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343459 4804 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343475 4804 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343485 4804 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343494 4804 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343505 4804 flags.go:64] FLAG: --logging-format="text"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343514 4804 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343525 4804 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343533 4804 flags.go:64] FLAG: --manifest-url=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343542 4804 flags.go:64] FLAG: --manifest-url-header=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343557 4804 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343565 4804 flags.go:64] FLAG: --max-open-files="1000000"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343576 4804 flags.go:64] FLAG: --max-pods="110"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343585 4804 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343594 4804 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343603 4804 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343612 4804 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343621 4804 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343632 4804 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343642 4804 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343666 4804 flags.go:64] FLAG: --node-status-max-images="50"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343675 4804 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343685 4804 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343694 4804 flags.go:64] FLAG: --pod-cidr=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343703 4804 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343720 4804 flags.go:64] FLAG: --pod-manifest-path=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343729 4804 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343738 4804 flags.go:64] FLAG: --pods-per-core="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343749 4804 flags.go:64] FLAG: --port="10250"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343758 4804 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343768 4804 flags.go:64] FLAG: --provider-id=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343777 4804 flags.go:64] FLAG: --qos-reserved=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343787 4804 flags.go:64] FLAG: --read-only-port="10255"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343796 4804 flags.go:64] FLAG: --register-node="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343805 4804 flags.go:64] FLAG: --register-schedulable="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343815 4804 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343831 4804 flags.go:64] FLAG: --registry-burst="10"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343847 4804 flags.go:64] FLAG: --registry-qps="5"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343856 4804 flags.go:64] FLAG: --reserved-cpus=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343868 4804 flags.go:64] FLAG: --reserved-memory=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343881 4804 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343890 4804 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343899 4804 flags.go:64] FLAG: --rotate-certificates="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343908 4804 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343917 4804 flags.go:64] FLAG: --runonce="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343928 4804 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343937 4804 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343946 4804 flags.go:64] FLAG: --seccomp-default="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343956 4804 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343965 4804 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343976 4804 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343987 4804 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343996
4804 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344005 4804 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344014 4804 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344022 4804 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344032 4804 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344041 4804 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344050 4804 flags.go:64] FLAG: --system-cgroups="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344059 4804 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344072 4804 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344082 4804 flags.go:64] FLAG: --tls-cert-file="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344091 4804 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344106 4804 flags.go:64] FLAG: --tls-min-version="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344117 4804 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344128 4804 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344139 4804 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344151 4804 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344160 4804 flags.go:64] FLAG: --v="2" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344177 4804 
flags.go:64] FLAG: --version="false" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344189 4804 flags.go:64] FLAG: --vmodule="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344231 4804 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344243 4804 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344509 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344522 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344532 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344540 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344549 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344557 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344566 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344576 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344584 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344592 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344601 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344610 4804 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344618 4804 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344630 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344641 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344653 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344664 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344673 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344681 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344689 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344697 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344705 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344713 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344720 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344728 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344736 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344744 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344751 4804 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344773 4804 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344781 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344789 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344796 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344804 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344814 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344824 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344834 4804 feature_gate.go:330] unrecognized feature gate: Example Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344842 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344850 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344859 4804 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344867 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344875 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344883 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344891 4804 feature_gate.go:330] 
unrecognized feature gate: VolumeGroupSnapshot Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344898 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344906 4804 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344913 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344921 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344929 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344936 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344945 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344952 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344960 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344968 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344976 4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344984 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344992 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345000 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 13:25:26 crc 
kubenswrapper[4804]: W0217 13:25:26.345007 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345015 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345023 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345033 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345041 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345048 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345056 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345063 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345071 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345078 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345086 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345094 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345104 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345114 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.345145 4804 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.357962 4804 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.358280 4804 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358431 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358444 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358450 4804 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358459 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358465 4804 feature_gate.go:330] unrecognized feature gate: Example Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358470 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358476 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358484 4804 feature_gate.go:330] unrecognized 
feature gate: PlatformOperators Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358491 4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358498 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358504 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358510 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358516 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358522 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358530 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358536 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358545 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358556 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358564 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358571 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358577 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358584 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358590 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358597 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358603 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358609 4804 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358616 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358696 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358707 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358715 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358723 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358731 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358737 4804 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358745 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358754 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358761 4804 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358767 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358773 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358779 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358785 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358792 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358799 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358804 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358810 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358816 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358830 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358836 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 13:25:26 crc 
kubenswrapper[4804]: W0217 13:25:26.358841 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358847 4804 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358853 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358859 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358867 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358883 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358891 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358898 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358905 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358912 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358918 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358925 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358932 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358940 4804 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358946 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358953 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358958 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358964 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358970 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358975 4804 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358980 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358986 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358992 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359000 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.359012 4804 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 13:25:26 crc 
kubenswrapper[4804]: W0217 13:25:26.359306 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359321 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359329 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359336 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359343 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359350 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359356 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359364 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359371 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359378 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359384 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359391 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359397 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359403 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359409 4804 feature_gate.go:330] unrecognized feature gate: 
CSIDriverSharedResource Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359414 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359420 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359425 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359431 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359437 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359446 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359452 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359457 4804 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359465 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359475 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359481 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359487 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359494 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359499 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359505 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359512 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359518 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359541 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359546 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359553 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359559 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359565 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359571 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359576 4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359582 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359588 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359594 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359600 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359606 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359611 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359619 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359688 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359703 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359709 4804 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359717 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359723 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359731 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359738 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359744 4804 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359750 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359784 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359793 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359800 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359806 4804 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359814 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359821 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359827 4804 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359855 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359861 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359866 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359872 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359878 4804 feature_gate.go:330] unrecognized feature gate: Example
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359884 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359889 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359895 4804 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359902 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.359911 4804 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.361595 4804 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.367997 4804 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.368150 4804 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.370096 4804 server.go:997] "Starting client certificate rotation"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.370157 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.371125 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-26 11:16:25.096677686 +0000 UTC
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.371278 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.393990 4804 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.396637 4804 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.397502 4804 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.414293 4804 log.go:25] "Validated CRI v1 runtime API"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.449646 4804 log.go:25] "Validated CRI v1 image API"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.451834 4804 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.457954 4804 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-13-20-50-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.457995 4804 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.477784 4804 manager.go:217] Machine: {Timestamp:2026-02-17 13:25:26.474271897 +0000 UTC m=+0.585691244 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2305fbdc-66f1-473f-924a-04d713bb59e5 BootID:bf842257-95c9-4f3c-a5d3-b668d3623b7b Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:dc:c8:69 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:dc:c8:69 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:62:e1:f4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4f:09:a3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:97:72:f4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:d2:39:2c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:70:e4:1f:8d:ca Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:22:12:a5:16:8f:5b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.478089 4804 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.478368 4804 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480526 4804 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480716 4804 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480764 4804 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480989 4804 topology_manager.go:138] "Creating topology manager with none policy"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.481001 4804 container_manager_linux.go:303] "Creating device plugin manager"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.481587 4804 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.481624 4804 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.482618 4804 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.483071 4804 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486548 4804 kubelet.go:418] "Attempting to sync node with API server"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486572 4804 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486619 4804 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486641 4804 kubelet.go:324] "Adding apiserver pod source"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486658 4804 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.491007 4804 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.492851 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.492930 4804 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.492980 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.493118 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.493258 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.495155 4804 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496664 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496688 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496695 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496723 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496735 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496743 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496751 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496763 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496772 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496781 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496801 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496810 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.499295 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.499876 4804 server.go:1280] "Started kubelet"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501091 4804 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501238 4804 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501873 4804 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501873 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:26 crc systemd[1]: Started Kubernetes Kubelet.
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.512341 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.512404 4804 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.512822 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 18:58:11.345160868 +0000 UTC
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514241 4804 server.go:460] "Adding debug handlers to kubelet server"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.514716 4804 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514266 4804 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514248 4804 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514931 4804 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.515693 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.515826 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.517349 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.516298 4804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18950b8c7f5626bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,LastTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525364 4804 factory.go:55] Registering systemd factory
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525400 4804 factory.go:221] Registration of the systemd container factory successfully
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525855 4804 factory.go:153] Registering CRI-O factory
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525890 4804 factory.go:221] Registration of the crio container factory successfully
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526012 4804 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526048 4804 factory.go:103] Registering Raw factory
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526067 4804 manager.go:1196] Started watching for new ooms in manager
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526776 4804 manager.go:319] Starting recovery of all containers
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529774 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529853 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529872 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529885 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529897 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529911 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529923 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529937 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529982 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529995 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530007 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530020 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530034 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530052 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530070 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530087 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530104 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530120 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530136 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530151 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530166 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530184 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530227 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530244 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530259 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530273 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530295 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530316 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530335 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217
13:25:26.530350 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530384 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530405 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530422 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530441 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530461 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530479 4804 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530497 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530514 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530532 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530550 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530571 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530618 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530637 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530697 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530720 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530738 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530757 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530775 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530793 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530811 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530830 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530847 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530874 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530894 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530914 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530932 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530954 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530973 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530991 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531007 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531024 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531044 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531070 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531088 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531106 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531149 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531161 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531177 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531219 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531233 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531246 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531262 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531277 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531291 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531305 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531320 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531333 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531350 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531370 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531389 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531406 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531420 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531435 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531449 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531469 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531487 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531506 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535074 4804 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535122 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535141 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535156 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535171 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535186 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535307 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535327 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535341 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535355 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535369 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535382 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535397 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535410 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535423 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535437 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535453 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535466 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535486 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535505 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535520 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535548 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535563 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535576 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535591 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535603 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535618 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535633 4804 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535647 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535660 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535675 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535688 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535701 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535716 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535731 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535744 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535760 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535773 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535787 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535799 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535813 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535826 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535837 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535850 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535864 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535875 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535888 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535902 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535914 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535931 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535943 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535954 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535968 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535982 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535994 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536008 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536021 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536034 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536046 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536059 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536072 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536085 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536098 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536111 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536124 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536138 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536150 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536162 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536176 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536188 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536221 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536235 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536249 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536261 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536275 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536288 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536300 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536313 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536327 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536340 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536353 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536367 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536380 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536394 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536407 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536420 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536431 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536443 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536458 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536471 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536485 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536498 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536511 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536525 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536538 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536551 4804 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536565 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536588 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536601 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536613 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536625 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536638 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536651 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536662 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536677 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536689 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536703 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536718 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536732 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536746 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536761 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536779 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536793 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536809 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536825 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536841 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536855 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536869 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536883 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536896 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536912 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536925 4804 reconstruct.go:97] "Volume reconstruction finished" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536935 4804 reconciler.go:26] "Reconciler: start to sync state" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.546896 4804 manager.go:324] Recovery completed Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.560653 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.563731 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.563817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.563837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.565966 4804 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.565994 4804 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.566020 4804 state_mem.go:36] "Initialized new in-memory state store" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.570656 4804 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.572632 4804 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.572684 4804 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.572725 4804 kubelet.go:2335] "Starting kubelet main sync loop" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.572783 4804 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.574158 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.574248 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.584341 4804 policy_none.go:49] "None policy: Start" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.585539 4804 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.585569 4804 state_mem.go:35] "Initializing new in-memory state store" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.615331 4804 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.642598 4804 manager.go:334] "Starting Device Plugin manager" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.642667 4804 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.642684 4804 server.go:79] "Starting device plugin registration server"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643332 4804 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643368 4804 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643533 4804 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643656 4804 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643664 4804 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.656146 4804 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.673341 4804 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.673481 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675258 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675347 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675601 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675735 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675805 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677336 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677553 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677592 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677767 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.678506 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.678603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.678625 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679243 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679257 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679369 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679473 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679511 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680333 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680546 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680636 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680673 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682548 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682522 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682818 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682851 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.683756 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.683789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.683803 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.719243 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.738937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739041 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739080 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739148 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739178 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739528 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739589 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739678 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739760 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739793 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739818 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739840 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739863 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739889 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.746488 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748088 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748155 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748221 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.748822 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.841753 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.841863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.841938 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842001 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842079 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842084 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842007 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842247 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842562 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842639 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842784 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842800 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842894 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842924 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842956 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842991 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842931 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843125 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843275 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843374 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843492 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843498 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843535 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843281 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843876 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.949767 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951532 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951606 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951673 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.952261 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.025667 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.039753 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.067923 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.083049 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.089393 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.109953 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4 WatchSource:0}: Error finding container 3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4: Status 404 returned error can't find the container with id 3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.121008 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.352759 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355260 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.356111 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.370971 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.371120 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.433415 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.433538 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.503257 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.513409 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 08:45:28.131066668 +0000 UTC
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.578868 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0a7a934f78e281c8f88227737b7f30d54cb5ca058b47787a991facbf9592952e"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.580027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.580934 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ac18128bfad11f4caf6ed0d0b5f6d02428aed2f1d6bebd0a585f011f1e8f3f7"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.581775 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c7635304167e905f1cb3b586b13f91d232901a8c76cf21458c9aa252bd6f3831"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.583076 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e67bc0e1272885d0d52ffb35751a295519092d92cdd212d3e948bda8734caaeb"}
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.921733 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.927862 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.927997 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:28 crc kubenswrapper[4804]: W0217 13:25:28.092033 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.092130 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.156766 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158505 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.159016 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.504084 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.514529 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 05:20:50.776575272 +0000 UTC
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.525314 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.527298 4804 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.589237 4804 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44" exitCode=0
Feb 17 13:25:28 crc 
kubenswrapper[4804]: I0217 13:25:28.589312 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44"} Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.589419 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.592015 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.592063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.592076 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930"} Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596623 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696"} Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596641 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7"} Feb 17 
13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596652 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d"} Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596669 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.599916 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.599971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.599992 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.602436 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8" exitCode=0 Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.602548 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8"} Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.602601 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.603564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.603585 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.603594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.604652 4804 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ec7641c3e61e45ce165b538d77e41c41463fe218e5274bb57372944010127cd4" exitCode=0 Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.604687 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ec7641c3e61e45ce165b538d77e41c41463fe218e5274bb57372944010127cd4"} Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.604745 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.605621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.605661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.605676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.607733 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608553 4804 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53" exitCode=0 Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608714 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53"} Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609893 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.846505 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.870588 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.878233 4804 event.go:368] "Unable to write event (may retry after 
sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18950b8c7f5626bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,LastTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 13:25:29 crc kubenswrapper[4804]: W0217 13:25:29.480145 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:29 crc kubenswrapper[4804]: E0217 13:25:29.480304 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.503069 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.515441 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 
00:57:36.349367205 +0000 UTC Feb 17 13:25:29 crc kubenswrapper[4804]: E0217 13:25:29.523290 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.623661 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba"} Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.623718 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386"} Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.626420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8"} Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.626481 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094"} Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.628905 4804 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="253431bca5d3b9f01e549f7c312eacf3f14ac51ef1c78bea9bb825f13ee2e119" exitCode=0 Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.629048 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"253431bca5d3b9f01e549f7c312eacf3f14ac51ef1c78bea9bb825f13ee2e119"} Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.629166 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.630700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.630740 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.630754 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.632789 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.632817 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c16d90fa3ea6207e30c8c7d82c6d77586b791fbe1a490094e34f01371a61d89b"} Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.632797 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633696 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633734 4804 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633878 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.759098 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760644 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760658 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760687 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 13:25:29 crc kubenswrapper[4804]: E0217 13:25:29.762030 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc" Feb 17 13:25:30 crc kubenswrapper[4804]: W0217 13:25:30.117683 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:30 crc 
kubenswrapper[4804]: E0217 13:25:30.117784 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.267985 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.504013 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.516243 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:58:04.544354523 +0000 UTC Feb 17 13:25:30 crc kubenswrapper[4804]: W0217 13:25:30.629295 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:30 crc kubenswrapper[4804]: E0217 13:25:30.629398 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.640254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.640314 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.641232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.641270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.641285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644152 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644181 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644212 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644266 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.646434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.646478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.646488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648056 4804 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="70e01634eee46cf26502b33fa597c0d4e345be38e1f95e56ca07dba16ce6367e" exitCode=0 Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648167 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648224 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648856 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"70e01634eee46cf26502b33fa597c0d4e345be38e1f95e56ca07dba16ce6367e"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648928 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648974 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650306 4804 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650286 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650385 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650397 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.651357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.651387 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.651403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:31 crc kubenswrapper[4804]: W0217 13:25:31.079585 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:31 crc kubenswrapper[4804]: E0217 13:25:31.079711 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" 
logger="UnhandledError" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.516416 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:30:48.596444804 +0000 UTC Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.657401 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658653 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"541cb2bc26adf968b3e261905d4f54392f7e0f3fb688675af189c3feeae5296f"} Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658741 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2febdf4e794d3ec70cd39dedfc469d822d38420a688d1f0599e3cc416851fdb3"} Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658766 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"39b06f8f75ba5208ad068cb319c4f3b420d3ffb99f9fb677a10859dafd34f27a"} Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658817 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658848 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658914 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658821 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:31 crc 
kubenswrapper[4804]: I0217 13:25:31.659149    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661881    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661941    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661972    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661986    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662055    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662087    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662106    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661944    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662890    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.516631    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 14:44:53.158254255 +0000 UTC
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.667025    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8dd69ec5e7306ece674a51d346fd95fc0733bdeb9623ac1bddd3d0f8a48cc421"}
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.667132    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.667132    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"539310a045c9a289318eb5035a3ba0c7f77907ff51dff0f5df5210e67acda4b5"}
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.667138    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.668774    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.668823    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.668840    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.670008    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.670052    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.670068    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.823629    4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.963061    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965074    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965130    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965150    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965234    4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.269098    4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.269298    4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.517765    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 14:32:31.018845354 +0000 UTC
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.639842    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.640118    4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.640240    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.642164    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.642368    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.642413    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.669444    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.670822    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.670879    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.670923    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.785490    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.785758    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.787567    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.787624    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.787647    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.950585    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.951003    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.952896    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.952972    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.952992    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.518739    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 05:44:58.939205993 +0000 UTC
Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.581696    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.672864    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.674164    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.674263    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.674290    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.420876    4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.421257    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.423329    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.423432    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.423462    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.518927    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:12:58.142182787 +0000 UTC
Feb 17 13:25:36 crc kubenswrapper[4804]: I0217 13:25:36.519272    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 08:24:34.262556302 +0000 UTC
Feb 17 13:25:36 crc kubenswrapper[4804]: E0217 13:25:36.656488    4804 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.519943    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:01:03.629208781 +0000 UTC
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.613485    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.613665    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.615083    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.615169    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.615187    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.621019    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.683430    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.685111    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.685185    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.685238    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:38 crc kubenswrapper[4804]: I0217 13:25:38.520239    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 03:28:56.933397153 +0000 UTC
Feb 17 13:25:39 crc kubenswrapper[4804]: I0217 13:25:39.521542    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 19:01:25.343788772 +0000 UTC
Feb 17 13:25:40 crc kubenswrapper[4804]: I0217 13:25:40.522118    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 03:38:47.531707637 +0000 UTC
Feb 17 13:25:40 crc kubenswrapper[4804]: I0217 13:25:40.899880    4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 17 13:25:40 crc kubenswrapper[4804]: I0217 13:25:40.900023    4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.504079    4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.522795    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 20:07:22.40900653 +0000 UTC
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.538141    4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.538346    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.539542    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.539582    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.539591    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.586890    4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.904769    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.905929    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.905976    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.905989    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.920953    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.431543    4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.431609    4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.438120    4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.438211    4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.523688    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 02:04:19.878062397 +0000 UTC
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.906909    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.908179    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.908229    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.908238    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:43 crc kubenswrapper[4804]: I0217 13:25:43.269446    4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 17 13:25:43 crc kubenswrapper[4804]: I0217 13:25:43.269513    4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 17 13:25:43 crc kubenswrapper[4804]: I0217 13:25:43.524333    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 23:23:42.939656301 +0000 UTC
Feb 17 13:25:44 crc kubenswrapper[4804]: I0217 13:25:44.524912    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 13:57:19.226521544 +0000 UTC
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.435079    4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.435266    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.436730    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.436759    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.436768    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.440336    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.525633    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 17:55:45.442618736 +0000 UTC
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.916355    4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.916415    4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.917518    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.917559    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.917570    4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:46 crc kubenswrapper[4804]: I0217 13:25:46.526544    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:54:17.913843945 +0000 UTC
Feb 17 13:25:46 crc kubenswrapper[4804]: E0217 13:25:46.656686    4804 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.421632    4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.423467    4804 trace.go:236] Trace[1758872461]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:33.913) (total time: 13510ms):
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1758872461]: ---"Objects listed" error: 13510ms (13:25:47.423)
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1758872461]: [13.510324875s] [13.510324875s] END
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.423496    4804 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.425542    4804 trace.go:236] Trace[1424273427]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:32.834) (total time: 14590ms):
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1424273427]: ---"Objects listed" error: 14590ms (13:25:47.425)
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1424273427]: [14.590838409s] [14.590838409s] END
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.425570    4804 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.425632    4804 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.428402    4804 trace.go:236] Trace[794050110]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:36.730) (total time: 10698ms):
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[794050110]: ---"Objects listed" error: 10697ms (13:25:47.428)
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[794050110]: [10.698059931s] [10.698059931s] END
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.428442    4804 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.429189    4804 trace.go:236] Trace[2072162229]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:34.608) (total time: 12821ms):
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[2072162229]: ---"Objects listed" error: 12821ms (13:25:47.429)
Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[2072162229]: [12.821148164s] [12.821148164s] END
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.429234    4804 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.429374    4804 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.434307    4804 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.467875    4804 csr.go:261] certificate signing request csr-z8hrm is approved, waiting to be issued
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.471939    4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53372->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.472045    4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53372->192.168.126.11:17697: read: connection reset by peer"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.472466    4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.472509    4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.476937    4804 csr.go:257] certificate signing request csr-z8hrm is issued
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.500985    4804 apiserver.go:52] "Watching apiserver"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.503796    4804 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.504174    4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"]
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.504692    4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.504735    4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.505102    4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.506172    4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.506570    4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.505060    4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.506651    4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.506922    4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.506990    4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.510632    4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.510869    4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511075    4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511392    4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511671    4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511756    4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511908    4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.512420    4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.514099    4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.516477    4804 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526722    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526792    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526828    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526856    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526884    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526909    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526929    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526949    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526975    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526994    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526953    4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 12:45:48.479668538 +0000 UTC
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527012    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527113    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527169    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527208    4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527278    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527331    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527381    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527415    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527454    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName:
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527521 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527536 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527575 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527799 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527851 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527882 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527893 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527970 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528060 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528099 4804 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528134 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528167 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528325 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528364 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528415 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528424 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528507 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528539 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528571 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528605 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528631 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528659 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528686 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528719 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528745 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528771 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528794 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528818 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528843 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528868 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528893 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528924 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528956 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528981 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529008 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529035 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529060 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529085 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529140 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529166 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529201 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529241 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529266 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529295 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529320 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529349 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529375 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529400 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529425 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529452 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529477 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529528 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529572 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529597 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529627 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529652 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529675 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529699 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529724 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529750 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.529772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529817 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529841 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529865 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529890 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529913 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529938 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529962 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529987 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530013 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530036 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 
13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530064 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530089 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530112 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530134 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530184 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530664 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530708 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530727 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530747 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530766 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.530786 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530805 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530824 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530842 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530859 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530878 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530897 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530917 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530963 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530981 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531001 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531017 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531035 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531053 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531072 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531156 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 
13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531172 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531189 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531208 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531248 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531268 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531284 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531302 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531318 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531335 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531355 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531373 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 
13:25:47.531396 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531411 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531429 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531465 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531484 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531519 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531538 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531555 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531573 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531592 4804 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531670 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531688 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531707 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531723 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531740 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531759 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531776 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531795 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531814 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531832 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") 
" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531851 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531869 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531890 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531912 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531934 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531953 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531972 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531990 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532011 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532028 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532046 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532064 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532080 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532098 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532118 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532136 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532172 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532188 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532209 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532265 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532287 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.532305 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532322 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532341 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532360 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532393 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532410 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532428 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532463 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532482 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532500 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532518 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532534 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532585 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532607 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532625 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532644 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532661 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532678 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532695 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.532736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532765 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532791 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532835 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532853 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532878 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532900 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532926 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532948 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532969 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532989 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533013 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533032 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.533108 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533120 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533132 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533143 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533154 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528565 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528680 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528893 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529112 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529351 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529573 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529669 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529919 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529951 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530124 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530810 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530856 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530891 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531015 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531033 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531196 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531523 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531885 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531924 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532552 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541558 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532579 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532669 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532916 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533276 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533537 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.534245 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.534440 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.534594 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.536720 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.536925 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.537161 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.538590 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.538848 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.538929 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539008 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539212 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539445 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539486 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539690 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540026 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540044 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540255 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540270 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540510 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540963 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541905 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541137 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541052 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541823 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542250 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542575 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542890 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542902 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.543567 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.544062 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.544305 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.544398 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.545267 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.545384 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.545896 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.546142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.546253 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.546264 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.547108 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.547178 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548300 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548503 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548589 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548877 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.549539 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.549822 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550325 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550542 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550612 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550894 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550934 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551003 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551308 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551334 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551618 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.552157 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.552303 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.552628 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.554459 4804 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.554924 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.558757 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.559401 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.559533 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.559933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.560620 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.560651 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.560663 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561034 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561528 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561657 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561774 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562030 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562110 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562204 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562277 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562389 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562368 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562747 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562935 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563477 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563555 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.563925 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563962 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.564019 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:48.063992362 +0000 UTC m=+22.175411709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564297 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564328 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564342 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564574 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564911 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.566351 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:48.066329398 +0000 UTC m=+22.177748735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566377 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566404 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566790 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566956 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.567178 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.567463 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.567604 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:48.067583329 +0000 UTC m=+22.179002666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563999 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.567244 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.568048 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.568340 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.568535 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.569092 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.569742 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.569948 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570051 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.567282 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570250 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570317 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570762 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570781 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571405 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571499 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571579 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571744 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.572177 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.572378 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.572561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.573470 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.573543 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.574233 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.574706 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.574769 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.575106 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.576585 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.576699 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.576918 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.577015 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.577092 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.577383 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.577508 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:48.077213254 +0000 UTC m=+22.188632801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.577821 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581021 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581056 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581076 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581141 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-02-17 13:25:48.081120162 +0000 UTC m=+22.192539499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.585073 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.585684 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.586415 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.590590 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.594486 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.596199 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4fbbv"] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.596470 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.596692 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.597341 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.597530 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.598295 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.598432 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.598594 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.599004 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.599161 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.599740 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600167 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600439 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600552 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600611 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600615 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600974 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.601041 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.602877 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603201 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603435 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603441 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603465 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603584 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603862 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603311 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.604406 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605206 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605267 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605337 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605533 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605580 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605705 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.606442 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.606425 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.609608 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.613169 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.614605 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.616572 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.627933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634732 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwzf\" (UniqueName: \"kubernetes.io/projected/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-kube-api-access-tdwzf\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634833 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-hosts-file\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634937 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634988 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634999 4804 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635010 4804 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635020 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635028 4804 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635037 4804 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635045 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635054 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635062 4804 reconciler_common.go:293] 
"Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635073 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635082 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635093 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635102 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635110 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635120 4804 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635130 4804 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635139 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635147 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635156 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635164 4804 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635174 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635182 4804 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635193 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635206 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635229 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635238 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635247 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635255 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635264 4804 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635272 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 
17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635280 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635290 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635299 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635307 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635317 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635326 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635334 4804 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635344 4804 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635353 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635362 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635371 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635379 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635388 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635396 4804 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635404 4804 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635412 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635432 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635441 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635449 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635457 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635465 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635474 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635485 4804 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635493 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635501 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635509 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635518 4804 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635526 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635533 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 
13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635542 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635550 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635559 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635568 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635576 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635584 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635592 4804 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635600 4804 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635608 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635616 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635626 4804 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635635 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635644 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635653 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635662 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635671 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635679 4804 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635687 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635695 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635703 4804 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635711 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635719 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635727 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635736 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635745 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635754 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635763 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635771 4804 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635780 4804 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635788 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635797 4804 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635805 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635813 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635821 4804 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635829 4804 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635837 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635845 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635854 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635862 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635870 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635877 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635885 4804 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635894 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635902 4804 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635909 4804 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635918 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635927 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635935 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635943 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635951 4804 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635959 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635967 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635975 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635982 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635991 4804 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635999 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636006 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636015 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636025 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636041 4804 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636050 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636058 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636066 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636074 4804 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636082 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636090 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636098 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636106 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636114 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636123 4804 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636132 4804 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636141 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636149 4804 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636157 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636165 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636174 4804 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636182 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636191 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636201 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636210 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636233 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636241 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636249 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636258 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636267 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636275 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636284 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636292 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636301 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636310 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636318 4804 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636327 4804 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636339 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636347 4804 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636356 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636363 4804 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636371 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636379 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636386 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636395 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636402 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636411 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636418 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636426 4804 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636433 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636441 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636449 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636457 4804 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636465 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636473 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636481 4804 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636489 4804 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636497 4804 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636505 4804 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636513 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636521 4804 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636529 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636537 4804 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636545 4804 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636554 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636562 4804 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636571 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636580 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636588 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636600 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636608 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636618 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636674 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.637615 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.653935 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.665880 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.669010 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.686498 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.688152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.711421 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.732460 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737806 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwzf\" (UniqueName: \"kubernetes.io/projected/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-kube-api-access-tdwzf\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737857 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-hosts-file\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737888 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 
17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737898 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737907 4804 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.738060 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-hosts-file\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.746118 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.756282 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.762927 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwzf\" (UniqueName: \"kubernetes.io/projected/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-kube-api-access-tdwzf\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.765975 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.775656 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.785848 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.801502 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.811406 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.821928 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: W0217 13:25:47.837062 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3 WatchSource:0}: Error finding container 6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3: Status 404 returned error can't find the container with id 6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.839368 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.848965 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: W0217 13:25:47.867431 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3 WatchSource:0}: Error finding container 31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3: Status 404 returned error can't find the container with id 31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.924376 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.924856 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"17f5c3907bc12fd9996f7cdea92a1898f7ca89896d2057e34075d2373e742fc4"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.930155 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.931870 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.934047 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533" exitCode=255 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.934114 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.948410 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.961819 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.961889 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zb7c5"] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.963639 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.965362 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.965393 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.966406 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.966659 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.966944 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.974133 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.974517 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.992702 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.006074 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.018947 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.036029 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.041272 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrlj\" (UniqueName: \"kubernetes.io/projected/6992e22f-b963-46fc-ac41-4ca9938dda85-kube-api-access-jvrlj\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.041367 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6992e22f-b963-46fc-ac41-4ca9938dda85-rootfs\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.042272 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6992e22f-b963-46fc-ac41-4ca9938dda85-proxy-tls\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.042350 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6992e22f-b963-46fc-ac41-4ca9938dda85-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.047132 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.057753 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.075618 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.084684 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.085776 4804 scope.go:117] "RemoveContainer" containerID="f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.087065 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.099837 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.114006 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.126279 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.140862 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142738 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142844 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/6992e22f-b963-46fc-ac41-4ca9938dda85-rootfs\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142895 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6992e22f-b963-46fc-ac41-4ca9938dda85-rootfs\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.142938 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.142901571 +0000 UTC m=+23.254320908 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143102 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrlj\" (UniqueName: \"kubernetes.io/projected/6992e22f-b963-46fc-ac41-4ca9938dda85-kube-api-access-jvrlj\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143260 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6992e22f-b963-46fc-ac41-4ca9938dda85-proxy-tls\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143194 4804 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143326 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143345 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143412 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143390767 +0000 UTC m=+23.254810254 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143348 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6992e22f-b963-46fc-ac41-4ca9938dda85-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143471 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143540 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143529002 +0000 UTC m=+23.254948339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143475 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143609 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143639 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143639 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143656 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143706 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143692907 +0000 UTC m=+23.255112424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143730 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143722038 +0000 UTC m=+23.255141605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.144674 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6992e22f-b963-46fc-ac41-4ca9938dda85-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.149037 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6992e22f-b963-46fc-ac41-4ca9938dda85-proxy-tls\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.164234 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrlj\" (UniqueName: \"kubernetes.io/projected/6992e22f-b963-46fc-ac41-4ca9938dda85-kube-api-access-jvrlj\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.283614 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: W0217 13:25:48.304781 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6992e22f_b963_46fc_ac41_4ca9938dda85.slice/crio-c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db WatchSource:0}: Error finding container c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db: Status 404 returned error can't find the container with id c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.350364 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kclvs"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.350746 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.358310 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4q55t"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.359895 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362351 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362592 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362625 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362914 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.363413 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.364002 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.364496 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.374054 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.394338 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.410712 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.434434 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446816 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-system-cni-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446840 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-cnibin\") pod 
\"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446881 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-cnibin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446904 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-k8s-cni-cncf-io\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446926 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446971 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-conf-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446997 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447036 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-system-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447063 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-netns\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447101 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-os-release\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447123 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447144 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-daemon-config\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447215 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-multus\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447240 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-etc-kubernetes\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447270 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-os-release\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447313 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-bin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447339 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-socket-dir-parent\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-kubelet\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447396 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-multus-certs\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447420 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-hostroot\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447458 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tnx7\" (UniqueName: \"kubernetes.io/projected/526d243d-907b-44f6-a601-de8e86515a3c-kube-api-access-5tnx7\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447491 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-cni-binary-copy\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447536 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgkw\" (UniqueName: \"kubernetes.io/projected/42eec48d-c990-43e6-8348-d9f78997ec3b-kube-api-access-rvgkw\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.448263 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.462477 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.478351 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 13:20:47 +0000 UTC, rotation deadline is 2026-12-30 17:08:56.707259972 +0000 UTC Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.478433 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7587h43m8.228833529s 
for next certificate rotation Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.480490 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.491409 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.505050 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.517666 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.527455 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:57:55.052907176 +0000 UTC Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.534657 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.544958 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548717 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-cni-binary-copy\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgkw\" (UniqueName: \"kubernetes.io/projected/42eec48d-c990-43e6-8348-d9f78997ec3b-kube-api-access-rvgkw\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548802 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548830 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-cnibin\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548877 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-system-cni-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548906 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-cnibin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-k8s-cni-cncf-io\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548971 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549063 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-conf-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549090 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549117 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-system-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549143 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-netns\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549172 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-os-release\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549221 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-multus\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-daemon-config\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549290 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-etc-kubernetes\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549327 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-os-release\") pod \"multus-kclvs\" (UID: 
\"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549353 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-bin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549384 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-socket-dir-parent\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549411 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-kubelet\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549437 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-multus-certs\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549465 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-hostroot\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 
13:25:48.549490 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tnx7\" (UniqueName: \"kubernetes.io/projected/526d243d-907b-44f6-a601-de8e86515a3c-kube-api-access-5tnx7\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-cni-binary-copy\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549696 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-system-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549953 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-k8s-cni-cncf-io\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549992 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-cnibin\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-kubelet\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550065 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-multus-certs\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550077 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-conf-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550185 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-netns\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550235 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-etc-kubernetes\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550247 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-system-cni-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: 
\"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550280 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-multus\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550263 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-socket-dir-parent\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550270 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550330 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-hostroot\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550308 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-cnibin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550201 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-bin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550415 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-os-release\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550505 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550835 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-daemon-config\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550876 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-os-release\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.551017 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.558574 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.567573 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tnx7\" (UniqueName: \"kubernetes.io/projected/526d243d-907b-44f6-a601-de8e86515a3c-kube-api-access-5tnx7\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.567702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgkw\" (UniqueName: \"kubernetes.io/projected/42eec48d-c990-43e6-8348-d9f78997ec3b-kube-api-access-rvgkw\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.574847 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.578364 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.578983 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.580371 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.581074 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.582386 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.583377 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.584274 4804 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.585631 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.585794 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.586550 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.587633 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.588399 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.589408 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.590573 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.591259 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.592451 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.593081 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.594973 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.595988 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.596928 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.597760 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.598296 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.599434 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.599865 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.601087 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.601546 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.602636 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.602996 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.603339 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.606230 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.607042 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.608027 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.608553 4804 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.608662 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.610654 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.611637 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.612048 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.613660 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.614830 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.615417 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.615495 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.616563 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.617394 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.619302 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.619890 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.621039 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.621653 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.622545 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.623101 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.624144 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.624900 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.625834 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.626309 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.627167 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.627790 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.628368 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.629253 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.631659 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.644628 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.655979 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.666840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.679192 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.686795 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: W0217 13:25:48.709023 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526d243d_907b_44f6_a601_de8e86515a3c.slice/crio-8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1 WatchSource:0}: Error finding container 8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1: Status 404 returned error can't find the container with id 8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1 Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.715516 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.716445 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719410 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719588 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719754 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719966 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.720125 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.720284 4804 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.721751 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.745137 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751618 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751633 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751652 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751666 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751690 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751703 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751719 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751743 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751758 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751775 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751803 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751816 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751839 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751856 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751871 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751885 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751899 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.763597 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.780327 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.793651 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.812708 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.827480 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.846559 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852610 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852659 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852704 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852721 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852756 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852777 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852800 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852828 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852850 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852881 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852899 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852914 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852929 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852957 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852974 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852994 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.853741 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.853794 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854092 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854318 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854911 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854955 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854969 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854994 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855020 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855002 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855049 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855062 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855039 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855080 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855105 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855145 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.856056 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.861038 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.863441 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.874244 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.885991 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.902492 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.916091 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.931319 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.939235 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.939298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.939313 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.941130 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.945074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.945911 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 
13:25:48.963598 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.963665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"6c116c27328e56e79548ef582b32338297cd7d0dc0365e613e80b60106d64f54"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.967178 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.967252 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.969402 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.971168 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerStarted","Data":"8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.972948 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/node-resolver-4fbbv" event={"ID":"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc","Type":"ContainerStarted","Data":"9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.972983 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4fbbv" event={"ID":"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc","Type":"ContainerStarted","Data":"acdd4ca1bd0f1e15b1086bca190cfff38491e5cb8e7e682e54bb3fb9aa4a2aec"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.973475 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.012856 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.044912 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.054802 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.098406 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.131499 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161083 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161257 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:49 crc 
kubenswrapper[4804]: E0217 13:25:49.161294 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161263315 +0000 UTC m=+25.272682672 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161344 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161424 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161451 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161452 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161465 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161493 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161559 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161584 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161618 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161634 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161677 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161621 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161600986 +0000 UTC m=+25.273020323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161713 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161705889 +0000 UTC m=+25.273125226 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161728 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.16172162 +0000 UTC m=+25.273140957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161980 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161970898 +0000 UTC m=+25.273390235 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.176145 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.215907 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.252418 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.292604 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.333148 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.376852 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.437482 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.451175 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.527680 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 19:47:25.287924316 +0000 UTC Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.528084 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.547609 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.572956 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.573105 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.573537 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.573598 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.573673 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.573716 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.592893 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.621460 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.657866 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.699812 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.737984 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.772454 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.815346 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.859853 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.894481 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.977245 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" exitCode=0 Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.977380 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.977472 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"a80f9c965ade76b1702626786407637ac7c475f156f06af4c297248b43c44248"} Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.978882 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f" exitCode=0 Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.978984 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.003076 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.030702 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.047813 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.063876 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.092986 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.134041 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.172828 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.214353 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.256898 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.273941 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.278430 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.293178 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.311392 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.352734 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.392869 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.435374 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.474304 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.514466 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.529092 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 11:20:01.320226111 +0000 UTC Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.560245 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.596319 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.631733 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.646669 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-z522z"] Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.647046 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.680550 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.687105 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.704278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.724656 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.744428 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.777640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d0b53df-b6de-4c33-a429-560638368e6c-serviceca\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.777744 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/7d0b53df-b6de-4c33-a429-560638368e6c-host\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.777789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4nd\" (UniqueName: \"kubernetes.io/projected/7d0b53df-b6de-4c33-a429-560638368e6c-kube-api-access-8d4nd\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.801333 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.834214 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.872475 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879269 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d0b53df-b6de-4c33-a429-560638368e6c-host\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879349 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4nd\" (UniqueName: \"kubernetes.io/projected/7d0b53df-b6de-4c33-a429-560638368e6c-kube-api-access-8d4nd\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879433 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d0b53df-b6de-4c33-a429-560638368e6c-serviceca\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879446 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d0b53df-b6de-4c33-a429-560638368e6c-host\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.881455 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d0b53df-b6de-4c33-a429-560638368e6c-serviceca\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.926650 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4nd\" (UniqueName: \"kubernetes.io/projected/7d0b53df-b6de-4c33-a429-560638368e6c-kube-api-access-8d4nd\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.933325 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eea
ee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.973939 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987030 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" 
event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987084 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987095 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987114 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.988365 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.990431 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" 
containerID="471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144" exitCode=0 Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.991009 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144"} Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.011339 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.021633 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:51 crc kubenswrapper[4804]: W0217 13:25:51.041519 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d0b53df_b6de_4c33_a429_560638368e6c.slice/crio-797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b WatchSource:0}: Error finding container 797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b: Status 404 returned error can't find the container with id 797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.050590 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.095403 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.133703 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.175189 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.182808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.182935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.182968 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183062 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183025834 +0000 UTC m=+29.294445231 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183097 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183117 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183129 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183181 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183164158 +0000 UTC m=+29.294583495 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.183175 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.183242 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183300 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183327 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183346 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 
13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183364 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183327 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183319953 +0000 UTC m=+29.294739290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183426 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183419067 +0000 UTC m=+29.294838404 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183840 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183905 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183887582 +0000 UTC m=+29.295306919 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.212837 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.255122 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.292465 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.333647 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.376494 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.412044 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.456826 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.494124 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.529611 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 09:59:25.176300906 +0000 UTC Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.534067 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.573694 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.573808 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.573853 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.573997 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.574062 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.574236 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.578924 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.996655 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29" exitCode=0 Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.997012 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29"} Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.998871 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z522z" event={"ID":"7d0b53df-b6de-4c33-a429-560638368e6c","Type":"ContainerStarted","Data":"0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd"} Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.998999 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z522z" event={"ID":"7d0b53df-b6de-4c33-a429-560638368e6c","Type":"ContainerStarted","Data":"797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b"} Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 
13:25:52.026750 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 
13:25:52.045862 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.062121 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.077890 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.095377 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.117393 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.130035 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.143050 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.162973 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.189882 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.204610 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.217004 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.228250 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.241225 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.254387 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.265887 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.279904 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.295634 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.334286 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.373659 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.410044 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.450949 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.490309 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.530104 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 10:09:51.513226531 +0000 UTC Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.531891 4804 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.570545 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.609570 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.651338 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.693746 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c335896
39722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.007829 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02" exitCode=0 Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.007896 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02"} Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.014428 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.020704 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.032905 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.046671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.062700 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.074262 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.085075 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.095483 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.107968 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.120185 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.132736 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.142915 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.176317 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.214636 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c335896
39722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.253656 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.530397 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-01 05:43:58.390528444 +0000 UTC Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.573535 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.573606 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.573555 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.573826 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.573996 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.574460 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.830102 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832395 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832582 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.842404 4804 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.842615 4804 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843898 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843940 4804 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.856262 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860482 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.872699 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877358 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877412 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877426 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877525 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.896391 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902657 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902711 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902722 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.920304 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924613 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924644 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.936710 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.936818 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938590 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938692 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938748 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.021536 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d" exitCode=0 Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.021585 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042067 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042580 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042659 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042711 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.055307 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.067671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.084006 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.097859 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.111581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.123647 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.137918 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145472 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145522 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145584 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.153254 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.169840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.193408 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.231092 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.248940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.248987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.248997 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.249018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.249029 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.263715 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.283518 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351828 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351843 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351853 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454868 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.531328 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 09:46:06.568427921 +0000 UTC Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558359 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558386 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662610 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662635 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765494 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869798 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869827 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869891 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973929 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.974003 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.029856 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a" exitCode=0 Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.029912 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.046253 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.061622 4804 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077631 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077685 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 
13:25:55.077702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077722 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077736 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.079179 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c49662
20e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.100188 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.119253 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.134131 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.146999 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.161728 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.179904 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabo
uts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180639 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180653 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180685 
4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.194601 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.207871 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.223286 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228046 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228189 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228230 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228210125 +0000 UTC m=+37.339629462 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228258 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228306 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228341 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228374 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228428 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228476 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228449753 +0000 UTC m=+37.339869090 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228491 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228527 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228540 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228545 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228563 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228575 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228505 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228496635 +0000 UTC m=+37.339915972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228632 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228611999 +0000 UTC m=+37.340031526 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228667 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.22866025 +0000 UTC m=+37.340079587 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.239596 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.259465 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283384 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283484 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386324 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386411 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386422 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489407 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489454 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.531572 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 12:23:24.405567501 +0000 UTC Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.573843 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.574053 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.574149 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.574356 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.574453 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.574609 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593475 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593565 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593650 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697031 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697117 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697129 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800304 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800670 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800679 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903510 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903541 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006433 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006451 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006464 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.036310 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerStarted","Data":"eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.048515 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.048944 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.048983 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.057339 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.075607 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.081887 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.082614 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.091456 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.106294 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109762 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109775 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.118369 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.133970 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.148463 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.163939 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.179839 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd7
17b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.194295 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.211949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.211989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.211999 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.212013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.212022 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.212795 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.231914 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.254559 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.273165 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.287825 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.301613 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.319887 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z 
is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.340990 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.358941 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0846
52d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.370654 4804 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.372448 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c/status\": read tcp 38.102.83.146:56570->38.102.83.146:6443: use of closed network connection" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.402323 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.416506 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419584 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419613 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.431153 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.450669 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.463811 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.476958 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.488638 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.499368 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522390 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522453 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.531761 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:34:49.350013425 +0000 UTC Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.588874 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"
name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.605714 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.618452 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624332 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624371 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624396 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.629775 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.640510 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.648537 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.667739 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.682709 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b9
30704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb
9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17
T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.695159 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.708882 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.723881 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727492 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727563 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727584 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727596 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.742018 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.754482 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.766376 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830649 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830670 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830726 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934928 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934947 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934970 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934987 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.037921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.037984 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.038001 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.038026 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.038043 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.053016 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141677 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141818 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245407 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348267 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450856 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450883 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450892 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.532640 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:06:32.157674694 +0000 UTC Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554066 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.573305 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.573362 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.573393 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:57 crc kubenswrapper[4804]: E0217 13:25:57.573451 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:57 crc kubenswrapper[4804]: E0217 13:25:57.573570 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:57 crc kubenswrapper[4804]: E0217 13:25:57.573750 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.622053 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656860 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656905 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759502 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759517 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862347 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862380 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862395 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.965969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966011 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966050 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068356 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068425 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068472 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068490 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171197 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171266 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273529 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273584 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376226 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376252 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376301 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479302 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.533438 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:56:40.96450051 +0000 UTC Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581688 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581698 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685069 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685169 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685262 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788640 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788651 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788682 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.890891 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.890958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.890980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.891010 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.891033 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.993968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994024 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994039 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994070 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.060346 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/0.log" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.065798 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963" exitCode=1 Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.065837 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.066588 4804 scope.go:117] "RemoveContainer" containerID="9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.091566 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096874 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.107672 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.122403 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.135099 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.147723 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.160965 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.172206 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.184156 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.198044 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200843 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200881 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200923 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.213633 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.226009 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.237351 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.248007 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.263998 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476
bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303369 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303410 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303452 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406161 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406175 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406185 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508528 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508561 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.534285 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 22:09:56.281713514 +0000 UTC Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.573728 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.573760 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.573811 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:59 crc kubenswrapper[4804]: E0217 13:25:59.573905 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:59 crc kubenswrapper[4804]: E0217 13:25:59.573990 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:59 crc kubenswrapper[4804]: E0217 13:25:59.574081 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611529 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611570 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611580 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611608 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714128 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714179 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714217 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714227 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816859 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816874 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816883 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922230 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922252 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922284 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025356 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025444 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.074137 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/0.log" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.077974 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.078352 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.100169 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.119280 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129573 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129685 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.139601 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.209603 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.224773 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T1
3:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232643 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232674 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232712 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.236031 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.247294 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.262118 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.278131 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.293491 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.308134 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.319486 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.333791 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335680 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335698 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335742 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.353343 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439328 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439385 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439396 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439419 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439436 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.535035 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:42:12.921915644 +0000 UTC Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542139 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542185 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542273 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645074 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645112 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748241 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748327 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850135 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850180 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850223 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850235 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952211 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952269 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.040406 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh"] Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.041104 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.043632 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.044384 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059286 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059359 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059392 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059440 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.061581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.083315 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.084571 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.086021 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/0.log" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.092916 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" exitCode=1 Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.092999 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.093115 4804 scope.go:117] "RemoveContainer" containerID="9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.094690 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.095110 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.098505 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.115697 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.130751 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.143880 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.158863 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161294 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161395 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.173295 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.189868 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.208865 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.226721 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227277 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227304 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be1ee3c4-2152-421a-b39c-c1455968a17c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227542 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6vd9x\" (UniqueName: \"kubernetes.io/projected/be1ee3c4-2152-421a-b39c-c1455968a17c-kube-api-access-6vd9x\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227671 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.241242 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.256124 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264891 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264930 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264970 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.276503 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.292491 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.313706 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.329884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.329970 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be1ee3c4-2152-421a-b39c-c1455968a17c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.330053 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vd9x\" (UniqueName: \"kubernetes.io/projected/be1ee3c4-2152-421a-b39c-c1455968a17c-kube-api-access-6vd9x\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.330121 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.331885 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.332796 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.332918 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.338496 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be1ee3c4-2152-421a-b39c-c1455968a17c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.351091 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 
2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.353451 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vd9x\" (UniqueName: \"kubernetes.io/projected/be1ee3c4-2152-421a-b39c-c1455968a17c-kube-api-access-6vd9x\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.363478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367279 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f
a29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367398 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367456 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367481 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367499 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: W0217 13:26:01.378828 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe1ee3c4_2152_421a_b39c_c1455968a17c.slice/crio-1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76 WatchSource:0}: Error finding container 1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76: Status 404 returned error can't find the container with id 1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76 Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.383667 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.401780 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.419851 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314
e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.435680 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.449507 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.462896 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470152 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470196 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470230 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.476644 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.489012 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.504054 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.523466 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical 
port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":
\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nh
b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.535444 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 
UTC, rotation deadline is 2025-12-22 08:34:44.183370812 +0000 UTC Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.535581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572530 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572566 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572883 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572971 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.573010 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.573108 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.573257 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.573371 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.689956 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690016 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690030 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690074 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793244 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793361 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.816311 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4jfgm"] Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.817077 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.817174 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.833056 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.846620 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.860034 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.871610 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.884019 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc 
kubenswrapper[4804]: I0217 13:26:01.896149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896214 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896244 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.899318 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.913668 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.927315 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.937902 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.937951 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9vb\" (UniqueName: \"kubernetes.io/projected/e77722ba-d383-442c-b6dc-9983cf233257-kube-api-access-pm9vb\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.940769 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.953911 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.973222 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.986769 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998337 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998393 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 
13:26:01.998413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998423 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.005704 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.022982 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.038759 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9vb\" (UniqueName: \"kubernetes.io/projected/e77722ba-d383-442c-b6dc-9983cf233257-kube-api-access-pm9vb\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.038884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.039048 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.039133 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:02.539108997 +0000 UTC m=+36.650528344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.046049 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical 
port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":
\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nh
b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.056869 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9vb\" (UniqueName: 
\"kubernetes.io/projected/e77722ba-d383-442c-b6dc-9983cf233257-kube-api-access-pm9vb\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.064713 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.098055 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100511 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.102268 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.102499 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.103669 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" event={"ID":"be1ee3c4-2152-421a-b39c-c1455968a17c","Type":"ContainerStarted","Data":"ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.103729 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" event={"ID":"be1ee3c4-2152-421a-b39c-c1455968a17c","Type":"ContainerStarted","Data":"c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.103745 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" event={"ID":"be1ee3c4-2152-421a-b39c-c1455968a17c","Type":"ContainerStarted","Data":"1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.120397 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.142581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.160196 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.176296 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.191967 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.202975 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203050 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203060 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203089 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.213073 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.224540 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.236539 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc 
kubenswrapper[4804]: I0217 13:26:02.247854 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.263127 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.280315 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eed
b413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9
c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.298555 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307191 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307266 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307287 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307301 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.312583 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.328256 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.345392 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.361660 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.378967 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.393104 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409621 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409660 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.411244 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab
1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.428691 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.454797 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.472464 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.488640 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc 
kubenswrapper[4804]: I0217 13:26:02.509452 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513695 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513744 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.524922 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.535855 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 18:01:32.791514979 +0000 UTC Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.542596 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.545470 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.545765 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.545908 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.545871215 +0000 UTC m=+37.657290772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.561788 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.575583 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.591441 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.608993 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617237 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617293 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617317 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617337 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.627165 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.642515 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.721700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722375 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722400 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825847 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825875 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928709 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928795 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928823 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032886 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136492 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136548 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136566 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239714 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239732 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239762 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239783 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.254402 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.254708 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:26:19.254657596 +0000 UTC m=+53.366076973 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.254982 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255188 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.255446 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255533 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:26:19.255494754 +0000 UTC m=+53.366914131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255611 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255993 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:19.25596904 +0000 UTC m=+53.367388377 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.255915 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.256306 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256120 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256638 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256783 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256973 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:19.256948281 +0000 UTC m=+53.368367658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256431 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.257284 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.257425 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.257600 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:19.257582272 +0000 UTC m=+53.369001649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.343965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344054 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344077 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344142 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447714 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447819 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447887 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.536562 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 11:38:08.567342933 +0000 UTC
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551361 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551484 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.560483 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.560755 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.560858 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:05.560833729 +0000 UTC m=+39.672253076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.573817 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.573920 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.573927 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574066 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574272 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.574399 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574599 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574776 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654797 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654855 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654920 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759376 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759435 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759498 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864004 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864072 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864118 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864141 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.958690 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967050 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967121 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967171 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.977711 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:03Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.007421 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.024844 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.047231 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070024 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070176 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070776 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.090945 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.107805 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.124788 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc 
kubenswrapper[4804]: I0217 13:26:04.135881 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.148379 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.167518 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eed
b413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9
c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172709 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172722 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172755 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.181910 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.183720 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d
615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.197380 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.203545 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208562 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.210406 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.225490 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231761 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231774 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231812 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.232086 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.246489 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.247751 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.250806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 
13:26:04.250918 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.250988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.251066 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.251135 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.269389 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273472 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273509 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.285064 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.285333 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287504 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390000 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390082 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390140 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390156 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492533 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492668 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492705 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.538034 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 06:37:19.965498347 +0000 UTC Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596085 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596136 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596148 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699775 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699823 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699847 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802586 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802631 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.905904 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906096 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906125 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.010004 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112764 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112916 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216179 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216275 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.318919 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.318996 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.319009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.319036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.319053 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421416 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421460 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421497 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524263 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.539478 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 11:31:38.817532964 +0000 UTC Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573464 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573469 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.573608 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.573741 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.574028 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.574125 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.584231 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.584363 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.584411 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:09.584398126 +0000 UTC m=+43.695817463 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626812 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626883 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626901 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626933 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626958 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730134 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730192 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730226 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730249 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730268 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833326 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833352 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937243 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937387 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041237 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041399 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041423 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.144923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145003 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145024 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145079 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248641 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248701 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352744 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352906 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352927 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456637 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456678 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456703 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.540092 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 13:36:39.214837998 +0000 UTC Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559624 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559653 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559671 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.595717 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.614697 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.632688 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662054 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662181 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662236 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.668364 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.685568 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.700444 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.713977 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.728850 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.742689 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc 
kubenswrapper[4804]: I0217 13:26:06.759330 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764155 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764333 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.774241 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.788920 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.799318 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.811117 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.828713 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd7
17b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.844263 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867088 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867099 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969650 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969691 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969723 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.072986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073045 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073059 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073068 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175111 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175119 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277650 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277686 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380274 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380342 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483134 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.540825 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 20:07:53.100965234 +0000 UTC Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573356 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573438 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573522 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573528 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573400 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573642 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573769 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573830 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586084 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586122 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586142 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.688921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.688980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.688996 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.689020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.689038 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793129 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793147 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896933 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896981 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896998 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999682 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999728 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999761 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:07.999772 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102631 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102657 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205918 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205956 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205982 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205991 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309041 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412813 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412907 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.515980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516025 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516035 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516060 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.541151 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 21:25:41.222811274 +0000 UTC
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.619998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620407 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620653 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620925 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722931 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.723010 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826284 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826310 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826319 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.928939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.928982 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.928992 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.929009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.929022 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032527 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032538 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032555 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032568 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135212 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135228 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135252 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135266 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238289 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238330 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238355 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341482 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341535 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.444896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.444952 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.444973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.445008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.445048 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.542395 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:33:28.809565165 +0000 UTC
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547615 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547734 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547753 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547798 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573700 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573735 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573773 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.573871 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573984 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.576494 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.576851 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.577070 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.628834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.629059 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.629225 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:17.629175075 +0000 UTC m=+51.740594412 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650917 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650958 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754553 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754678 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754697 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858075 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858085 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858107 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858117 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074274 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074339 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074373 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074388 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191501 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191552 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293612 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293688 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.395962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.395998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.396006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.396023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.396034 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499096 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499122 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499132 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.543356 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 02:49:37.536911094 +0000 UTC
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601688 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601717 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601742 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601755 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704910 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704918 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806624 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806668 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806703 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909349 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909358 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012886 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012977 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012989 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115185 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115213 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217524 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319981 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319999 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.320014 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422750 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422802 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422824 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422835 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526017 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526083 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.543603 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 06:29:57.253933385 +0000 UTC Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.572945 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.573058 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.573080 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573156 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.573331 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573331 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573427 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573496 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628607 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628690 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628705 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730738 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730818 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730839 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730859 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833547 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833615 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833645 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936019 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936065 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936100 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.038990 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039052 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039083 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039096 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141163 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243822 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243905 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346432 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346473 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491244 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491257 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491275 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.543753 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 20:40:55.22632969 +0000 UTC Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593707 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593721 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697456 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697525 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697670 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800797 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800830 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800840 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904326 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904390 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007505 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007554 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110651 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110663 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110684 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110699 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214393 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214456 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318479 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318591 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318647 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422673 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422703 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.526871 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.526950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.526969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.527000 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.527028 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.544298 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 22:01:47.9271917 +0000 UTC Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.573909 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.574085 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.574138 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.573952 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574253 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574370 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574517 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574645 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630439 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630513 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734367 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734458 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734485 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734522 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734547 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.837984 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838083 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838146 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.941535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.941911 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.942058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.942182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.942444 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046854 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046878 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046931 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150173 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150249 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150289 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150316 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254143 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254257 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254292 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358629 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463534 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.544983 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 15:54:30.320974709 +0000 UTC Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567447 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567510 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567535 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.575374 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659703 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659739 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659764 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.692092 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700674 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700876 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700902 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.725435 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731562 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731608 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731627 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.754737 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760672 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760736 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760764 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.786739 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792974 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.817904 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.818358 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822657 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822729 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822770 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822785 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925119 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925164 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.028971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029113 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029132 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132657 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132708 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.156687 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.160435 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.161533 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.182115 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.201850 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.218956 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235296 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235338 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.237801 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.249876 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.267583 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.295294 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.313827 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.330041 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314
e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337690 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337761 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337798 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.345798 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.361079 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.375815 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.390904 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.408947 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.432344 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443701 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443738 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.456571 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.545131 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:13:17.75859469 +0000 UTC Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546673 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546710 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546722 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573477 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573512 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573550 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.573593 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.573721 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.573824 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.574177 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650508 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650522 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753780 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753831 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753866 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753881 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.857962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858052 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858108 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962800 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962879 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962905 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962944 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962968 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066485 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066613 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066633 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.166657 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.167593 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169424 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169607 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169644 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169700 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.172392 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" exitCode=1 Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.172496 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.172593 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.174099 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:16 crc kubenswrapper[4804]: E0217 13:26:16.174448 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.194863 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.221010 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.248817 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274307 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.285931 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.307559 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.330714 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.348508 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.368262 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.376962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.377057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.377080 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.377109 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.377126 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.388615 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.401177 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.414890 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a8119
2bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.427558 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.440641 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.454527 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.467383 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480121 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480222 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.484685 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z 
is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.545544 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 10:34:58.136003398 +0000 UTC Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582868 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.590468 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.610700 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.632237 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.651830 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.669976 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.683931 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685427 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.685583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685654 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.697871 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.712262 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.724220 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.739759 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.755639 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.765652 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.784134 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.787899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788084 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788226 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.800963 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.815113 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.828646 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892685 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892697 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892720 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892736 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.995986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996107 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996121 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100515 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100540 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.178918 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.184623 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.184987 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203344 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203445 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.208016 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.240908 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.257046 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.280885 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.296325 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.306921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307706 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.311622 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.323725 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.338091 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.355342 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.376478 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.393404 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.409736 4804 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.410957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411302 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411386 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.421817 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.432374 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.443323 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.455842 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515383 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.546706 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 12:28:24.206636108 +0000 UTC Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573304 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573450 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.573522 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573588 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.573699 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.573779 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573916 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.574158 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619392 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619483 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619499 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.651034 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.651259 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.651370 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:33.651345226 +0000 UTC m=+67.762764753 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.722923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723016 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723147 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825839 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825890 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929210 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929292 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.031911 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032059 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032097 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134662 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134682 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238267 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238278 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238304 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238318 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.340945 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.340986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.340998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.341018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.341032 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.444400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.444983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.445397 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.445592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.445739 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.546960 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:04:42.089267373 +0000 UTC Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549426 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549482 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549520 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549534 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652555 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652581 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652599 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.754686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.754862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.754934 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.755007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.755066 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860474 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860693 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.861756 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.878776 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.891471 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.912692 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687
fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.928063 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.944067 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.961994 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.964366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.964550 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.965155 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.965490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.965905 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.979981 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.992621 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.004226 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.019799 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.040470 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.053660 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069460 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069928 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069959 4804 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.070028 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.072571 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.087434 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.101333 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.115464 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.130538 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc 
kubenswrapper[4804]: I0217 13:26:19.173607 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173678 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173713 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.270162 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.270445 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.270393882 +0000 UTC m=+85.381813279 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271280 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271555 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271772 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271398 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.272132 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.272338 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.272593 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.272571404 +0000 UTC m=+85.383990781 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271507 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271810 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271929 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273069 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.273034788 +0000 UTC m=+85.384454165 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273335 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273529 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273511 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.273479383 +0000 UTC m=+85.384898790 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273891 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:26:51.273865075 +0000 UTC m=+85.385284532 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276352 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276489 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276516 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380312 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380375 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380395 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380442 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484118 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484233 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484267 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.548118 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 00:20:06.948393627 +0000 UTC Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.573716 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.573837 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.574011 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574097 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574254 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574536 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.573872 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574669 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587329 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587440 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587490 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690217 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.792960 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896359 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998755 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998880 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102291 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102432 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102551 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206139 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206171 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206191 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.309826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310419 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417890 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417966 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.418013 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522110 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522164 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522186 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522218 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.549495 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:22:13.200446638 +0000 UTC Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625703 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625740 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729496 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729542 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729594 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832187 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832268 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832306 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935154 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935175 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038222 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038279 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038312 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038326 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140563 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140605 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243527 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243563 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243575 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346508 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346569 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346591 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346602 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449218 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449293 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.550059 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 23:09:56.470438418 +0000 UTC
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552529 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552613 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573742 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573774 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573787 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.573868 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573742 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.574071 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.574263 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.574448 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655578 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655612 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655625 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759213 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759301 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759355 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862350 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862412 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862435 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862451 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966283 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966469 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070282 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.172976 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173055 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173070 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276424 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276538 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379501 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379550 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379590 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482417 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482446 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.550944 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 10:50:11.10991498 +0000 UTC
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584828 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584924 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584939 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692437 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692468 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692476 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692494 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692504 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795258 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795353 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898439 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898555 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898652 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.001948 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002041 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002101 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106052 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106114 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106154 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106170 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208265 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208317 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208347 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312254 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416221 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416243 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416256 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520565 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520610 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.551933 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 03:51:05.525021297 +0000 UTC
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573450 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573579 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573625 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573688 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.573718 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.573878 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.573973 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.574074 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623289 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623350 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623370 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623383 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.726858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.726973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.727006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.727068 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.727098 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832942 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832970 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832982 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.935852 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936126 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936466 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039775 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039828 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039879 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143301 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143379 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143407 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143423 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246846 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246952 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.247009 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350468 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350556 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454943 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454996 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.552930 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 01:26:45.856957319 +0000 UTC Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558572 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558613 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662580 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662598 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662610 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821746 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821813 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924854 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924941 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924961 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924997 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.925024 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025737 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.053388 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059507 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059542 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059589 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.076059 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082808 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082881 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.100245 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.105915 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.105969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.105982 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.106004 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.106040 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.123117 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127766 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127843 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127902 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.143991 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.144107 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146093 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146117 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146126 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146143 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146153 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248538 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248574 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351427 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351466 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351483 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.453770 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.453941 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.453971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.454054 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.454127 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.553322 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 10:15:05.156920557 +0000 UTC Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558600 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558632 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574291 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574306 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574604 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574649 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574302 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574309 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574753 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574940 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661433 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765933 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765961 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765977 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869109 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869351 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972145 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972158 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972181 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972219 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075795 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075852 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075876 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.179991 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180262 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.283979 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284136 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387001 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387163 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387191 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491350 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491410 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.553747 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 02:26:51.707562936 +0000 UTC Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.600172 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606598 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606613 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606649 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.615863 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.632277 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.650685 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.665924 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.681364 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.697865 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709103 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709152 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc 
kubenswrapper[4804]: I0217 13:26:26.709182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709210 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.712840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.729816 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.746987 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.773417 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.791413 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.805354 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc 
kubenswrapper[4804]: I0217 13:26:26.811710 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811876 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811904 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.828150 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.844489 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.859344 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.873033 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915332 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915363 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915378 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018645 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018927 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122666 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122719 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122736 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122777 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225025 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225110 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225134 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329496 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329559 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434135 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434250 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434272 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434307 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434327 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.539011 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.554383 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 16:27:22.639784046 +0000 UTC Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573534 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573534 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573553 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573680 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.573876 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.574179 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.574383 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.574499 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642399 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642466 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745867 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745938 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745985 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.850965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.851325 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.851504 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.852345 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.852373 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956721 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956856 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060278 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060328 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060352 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163652 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163670 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266821 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369701 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369736 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473658 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473773 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473793 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.554810 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 17:56:39.714316412 +0000 UTC Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576616 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576641 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576664 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576681 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.577358 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:28 crc kubenswrapper[4804]: E0217 13:26:28.577619 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679630 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679703 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679740 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782685 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782741 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782752 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782773 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782784 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884859 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884889 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884910 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988813 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988846 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988868 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091435 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091492 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194441 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.296920 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.296975 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.296991 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.297013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.297026 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399903 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399928 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399938 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502229 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502310 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.554924 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 20:31:24.398254821 +0000 UTC Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573352 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573434 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573444 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573447 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.573621 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.573875 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.574020 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.574160 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604974 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604995 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.605010 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708292 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708392 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811427 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811443 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811486 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914758 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914827 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914883 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.017949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018012 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018027 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018048 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018060 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120317 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120345 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120358 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222915 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222945 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222957 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326468 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326514 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429627 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429639 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429662 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429673 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532565 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.556060 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:45:51.069984866 +0000 UTC
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635134 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635187 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635225 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635262 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738559 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738572 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738608 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842077 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842115 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842128 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948908 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055451 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055959 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.056026 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.159405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.159781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.159995 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.160102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.160187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262886 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262898 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262937 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262951 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366311 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366361 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366395 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366410 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470163 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470296 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470314 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.556370 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 02:22:40.198698764 +0000 UTC
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573484 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573528 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573667 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.573770 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573815 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573894 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573909 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.573990 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.574078 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.574107 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.574301 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.676955 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677043 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779502 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779548 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882225 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882243 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986132 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986241 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986273 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986302 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986324 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090012 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090076 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090113 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090125 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193402 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193492 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193506 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296012 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296118 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399695 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399821 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399836 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503586 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503632 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.556585 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 02:16:08.050937545 +0000 UTC
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607093 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607157 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607261 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709931 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709981 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.710000 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.812902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.812969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.812988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.813020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.813039 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915897 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915911 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915934 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915951 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.018978 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019026 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019060 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019073 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122580 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122655 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122668 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227489 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227505 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227590 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330571 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330692 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330711 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434707 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538941 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538984 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.557417 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 16:37:48.081921087 +0000 UTC Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.573838 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.573896 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.573855 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574105 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.574132 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574297 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574662 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574770 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641310 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641345 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641385 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743741 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743850 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.743999 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.744073 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:05.744054928 +0000 UTC m=+99.855474275 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743859 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846641 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846656 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.949184 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.949730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.949987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.950069 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.950139 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053681 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053791 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053866 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156534 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156558 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156570 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258667 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258725 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258740 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258783 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.361744 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362734 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.467974 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468055 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468066 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.558601 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:37:33.509870398 +0000 UTC Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570901 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570916 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570944 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570959 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674340 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674380 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674416 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777459 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777507 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880570 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880608 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880619 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984113 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984151 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091616 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091649 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091662 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193913 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193970 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.194000 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296139 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296219 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296233 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296265 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399136 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399291 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399352 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399391 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464920 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464934 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464973 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.478573 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.483940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.483989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.483999 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.484017 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.484027 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.497452 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.501356 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.501651 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.501833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.502215 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.502755 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.517127 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521505 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521637 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521865 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.536694 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542160 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542192 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542231 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.559192 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:51:23.943957618 +0000 UTC Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.559899 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",
\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.560070 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562372 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562476 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562493 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.572999 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.572999 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573176 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.573013 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573274 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573344 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.573029 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573457 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.665969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666115 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666140 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769425 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769456 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872156 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872174 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872185 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975590 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078384 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078396 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181465 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181485 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181497 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.255412 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/0.log" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.255468 4804 generic.go:334] "Generic (PLEG): container finished" podID="42eec48d-c990-43e6-8348-d9f78997ec3b" containerID="26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa" exitCode=1 Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.255511 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerDied","Data":"26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.256033 4804 scope.go:117] "RemoveContainer" containerID="26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.273089 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.285750 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286065 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286094 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286177 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286328 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.301662 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.314692 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.332056 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.348693 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.360656 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.373707 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc 
kubenswrapper[4804]: I0217 13:26:36.390033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390883 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.405260 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.420054 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.439895 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.458308 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.486550 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492511 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492550 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.512162 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.527939 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.540840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.553496 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.559756 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 20:20:01.229515341 +0000 UTC Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.589461 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595553 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595590 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.611122 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.625838 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.646176 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.659401 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.671663 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.686977 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc 
kubenswrapper[4804]: I0217 13:26:36.697386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697414 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697425 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697452 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.704953 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.718682 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.732304 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.747334 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.762029 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.776775 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.793594 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799562 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799578 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799617 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.810826 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.825825 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.841362 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902171 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902184 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902222 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902233 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005409 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005443 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005453 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108309 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108324 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210770 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210854 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210866 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.260991 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/0.log" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.261071 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.275607 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.288541 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.300161 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314605 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314643 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314688 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.349833 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.366075 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.383932 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.397516 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.410158 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.417431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.417671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.417857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc 
kubenswrapper[4804]: I0217 13:26:37.418335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.418566 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.421153 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.432037 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc 
kubenswrapper[4804]: I0217 13:26:37.444036 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.460473 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.478023 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.491566 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.503802 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.516017 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525541 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc 
kubenswrapper[4804]: I0217 13:26:37.525576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.528930 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.560436 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:58:27.406434549 +0000 UTC Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573834 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573914 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573856 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573987 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574030 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574147 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574436 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574554 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627765 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627774 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627798 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788729 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788816 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788847 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.789094 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891905 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891947 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891963 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891989 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.995647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996014 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996230 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996327 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099062 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099072 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099101 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200714 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200757 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200772 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200801 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302803 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302847 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302886 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406193 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406876 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406916 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510889 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510955 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510995 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.511011 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.561224 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 09:51:55.920514478 +0000 UTC Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614293 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614328 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614347 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717161 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717623 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717728 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717824 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717927 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820530 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820644 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820658 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820686 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923336 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923420 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923435 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026577 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.128953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129004 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129032 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129045 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231584 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231637 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334092 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334210 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437591 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437627 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437641 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540541 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.562942 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:38:24.297527462 +0000 UTC Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573369 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573597 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573395 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573690 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573760 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573823 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643378 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643396 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746535 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849235 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849287 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849316 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952129 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952162 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054943 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054974 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054984 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158926 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158942 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158991 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.261971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262464 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262520 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365453 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365484 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468895 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468997 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.563332 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 08:34:47.779070346 +0000 UTC Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571907 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674819 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674833 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777550 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777610 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777654 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.880925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.880988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.881006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.881034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.881053 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985329 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985369 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985389 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.087873 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088488 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191585 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191636 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295499 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398526 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398599 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.500969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501067 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.563972 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 05:44:24.886720453 +0000 UTC
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573343 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573414 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573344 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.573515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573671 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.573706 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.573973 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.574251 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604061 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604122 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604464 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.707749 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.708153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.708583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.708994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.709331 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813459 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813510 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813529 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917061 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917161 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917181 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917253 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020693 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020752 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020792 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020808 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124543 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124589 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124601 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124633 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227740 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227808 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227830 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227856 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227873 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331215 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331228 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434725 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434779 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434804 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537727 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537804 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537859 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.564869 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 18:58:00.538066844 +0000 UTC
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.575325 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643582 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643630 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643659 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747005 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747067 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747100 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747119 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849397 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953779 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953798 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953827 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953848 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057215 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057227 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.159957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160041 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160075 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160089 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263600 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263668 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263717 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263737 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.284958 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.287533 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"}
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.288072 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.309357 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.321690 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.334415 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.345671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.358437 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.366965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367056 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.370822 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc 
kubenswrapper[4804]: I0217 13:26:43.386121 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.405603 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.420474 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.436287 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.449757 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.466151 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470363 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470374 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.484858 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.499733 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.512191 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.525242 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.538720 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.565723 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:03:39.197563437 +0000 UTC Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573122 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573270 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573128 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573383 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573539 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573651 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573709 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573851 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575508 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575549 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678145 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678157 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678190 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781449 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781586 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884719 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884814 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884866 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884911 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988565 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092674 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092766 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092849 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195652 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.293799 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.294413 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297624 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297695 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297719 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297741 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.298321 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" exitCode=1 Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.298436 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.298557 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.299398 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:26:44 crc kubenswrapper[4804]: E0217 13:26:44.299654 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.323497 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.344429 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.363081 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.386024 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401276 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401291 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.410882 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.428142 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.443571 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc 
kubenswrapper[4804]: I0217 13:26:44.459671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.477320 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.494757 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505332 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc 
kubenswrapper[4804]: I0217 13:26:44.505381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505400 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.516450 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.540887 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.556293 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.566401 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:09:38.276596308 +0000 UTC Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.575932 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.592057 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5
51e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.606417 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609241 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609305 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609362 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.619631 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712725 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.815990 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816082 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816129 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816146 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920260 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920320 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024143 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024252 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127344 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127358 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230772 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230818 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230832 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.303495 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.307910 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.308127 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.323964 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334273 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334354 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 
13:26:45.334408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334428 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.341731 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.364995 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.381148 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.399330 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.415666 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.431339 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437381 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.445156 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc 
kubenswrapper[4804]: I0217 13:26:45.467751 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.482549 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.496784 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.512336 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.523773 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.539538 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.542268 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.542700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.542834 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.543231 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.543389 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.559668 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.568000 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:11:18.903609411 +0000 UTC Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.573580 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.573786 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.574299 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.574423 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.574663 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.574860 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.575102 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.574908 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.575454 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.593049 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646613 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.708957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709003 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709065 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.729599 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736733 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.737374 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.762312 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769393 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769492 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.788451 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793720 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793756 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793768 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793791 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793805 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.805791 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809716 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809789 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.826279 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.826514 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.828949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829003 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829063 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931752 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931798 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931816 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.035832 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036419 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036871 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140577 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140605 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140623 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245371 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347904 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347918 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347938 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347949 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451186 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451259 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451294 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554581 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554626 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.569295 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 01:51:04.871056412 +0000 UTC Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.591537 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.605385 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.624021 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.642287 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc 
kubenswrapper[4804]: I0217 13:26:46.657479 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657530 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657554 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657570 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.658411 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.674158 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.691240 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.712686 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.727550 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.740922 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.754554 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759504 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759513 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759540 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.768376 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.785059 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.804624 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.815653 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.835552 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.850783 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861907 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861924 4804 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861967 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964484 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964804 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964900 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.965120 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068572 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068680 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.172799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173283 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173812 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277226 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277267 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380544 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380697 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380725 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483340 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483352 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.570545 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 15:22:02.478099346 +0000 UTC Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573117 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573352 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573342 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573451 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573583 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573639 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573739 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573855 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587373 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587427 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691437 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691480 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.794938 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.794997 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.795015 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.795045 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.795066 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898396 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898407 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.001969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002025 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002074 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105227 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105311 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105325 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208812 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208931 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208949 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.311939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312032 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312114 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414349 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414388 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414402 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518419 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518451 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518486 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518509 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.571226 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 16:53:49.741718618 +0000 UTC Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621429 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621460 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729193 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.832919 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.832989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.833013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.833044 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.833067 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936128 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936216 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936282 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.038939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.038996 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.039013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.039037 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.039055 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141872 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141884 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245160 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245275 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245296 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347670 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347860 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450640 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450649 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554667 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554685 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.572418 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:29:46.294059487 +0000 UTC Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.573819 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.573904 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.573819 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574016 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.574082 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574267 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574455 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574619 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658305 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658453 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658832 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762145 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762246 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866072 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866179 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866293 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866319 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.969858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.969978 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.970007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.970047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.970074 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073899 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176768 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176873 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176913 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176976 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280433 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280513 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280533 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280546 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383388 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383490 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487632 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487701 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.573425 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 00:47:07.574485598 +0000 UTC Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591074 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591159 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591186 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591230 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695095 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695154 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695187 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695221 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797173 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797240 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797278 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900465 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900515 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004494 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004532 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004565 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107860 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107897 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107916 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211038 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211118 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211154 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313494 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349878 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349912 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349938 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350083 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350143 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350124632 +0000 UTC m=+149.461543979 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350140 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350244 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350262 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350299 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350173 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350379 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350334 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350312958 +0000 UTC m=+149.461732305 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350407 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350446 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350412632 +0000 UTC m=+149.461832119 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350503 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350476884 +0000 UTC m=+149.461896351 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350629 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350607057 +0000 UTC m=+149.462026434 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422741 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422831 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422891 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.525952 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526475 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526513 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526538 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573030 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573179 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573071 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573047 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573341 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573411 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573530 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573599 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 09:29:44.719726923 +0000 UTC
Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573870 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630340 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630443 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630501 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734623 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734638 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837296 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837320 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940175 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940266 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.043884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.043959 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.043986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.044019 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.044044 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147273 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147361 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147445 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251625 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251651 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355397 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355571 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355595 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459717 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459749 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459809 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563706 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563823 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563845 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.573722 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 02:07:19.265326437 +0000 UTC
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.591621 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.667002 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769969 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873117 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873180 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873191 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873228 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873240 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976410 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976474 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976496 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.079923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.079972 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.079987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.080006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.080018 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183569 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183650 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183746 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183769 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287842 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398147 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398219 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398249 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501707 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501756 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501799 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573615 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573686 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.573791 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573717 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.573869 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573976 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.574092 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573984 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 09:34:25.807309553 +0000 UTC Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.574246 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605716 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605744 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605765 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708177 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708235 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708244 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708260 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708270 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812283 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812311 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915158 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915281 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018734 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018873 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018935 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.122648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123138 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123414 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123435 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226851 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226879 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226899 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329792 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329886 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329909 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329922 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433458 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433502 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537253 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537281 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.574875 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:53:10.983068433 +0000 UTC Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640193 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640302 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640326 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640408 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744279 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744330 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744348 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847475 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847500 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847519 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950080 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950113 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.053688 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.053805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.053829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.054564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.054645 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158511 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158582 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158645 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.261971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262088 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364851 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364880 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364908 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.467921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.467973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.467989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.468008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.468023 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571514 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571764 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573462 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573561 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573466 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573466 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.573892 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.574010 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.574239 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.574396 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.575634 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 07:23:18.463838243 +0000 UTC Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.674699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675005 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675084 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675374 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.778927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779432 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779599 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882808 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882890 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916877 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916960 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.938669 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.943705 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.943864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.943959 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.944053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.944137 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.964934 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.969772 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.970092 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.970214 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.970315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.970433 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.988355 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.993455 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.993595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.993876 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.993994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.994092 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: E0217 13:26:56.011787 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017331 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017352 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017403 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: E0217 13:26:56.036262 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: E0217 13:26:56.036602 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039474 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039501 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039515 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.613700 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 01:10:26.371310323 +0000 UTC Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631420 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631454 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.637927 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.653953 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.667685 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.683353 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.698562 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.711503 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.728214 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734028 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734127 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.741591 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.755992 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.767310 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec20591e-8008-4af6-83b7-51eb41217805\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16d90fa3ea6207e30c8c7d82c6d77586b791fbe1a490094e34f01371a61d89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.781383 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.800174 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.814820 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.828178 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836507 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836554 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836585 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836597 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.840862 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.851846 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.863988 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.877365 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc 
kubenswrapper[4804]: I0217 13:26:56.939988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940074 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940100 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940121 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042742 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042754 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042777 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042790 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145759 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145872 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145886 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.248980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249075 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249089 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351673 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351691 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351709 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351721 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.454983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455068 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455115 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559077 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559237 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573698 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573745 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573691 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.573840 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573943 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.574130 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.574451 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.574957 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.575170 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.575359 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.615307 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 03:59:15.388177514 +0000 UTC Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.662306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.662722 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.662904 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 
crc kubenswrapper[4804]: I0217 13:26:57.663060 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.663279 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.767090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.767669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.767841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.768070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.768257 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.871461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.871885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.872086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.872276 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.872400 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.975520 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.975926 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.976266 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.976664 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.977030 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080132 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080146 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183672 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183821 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183876 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286891 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286909 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286937 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286955 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390783 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390856 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494754 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494815 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494858 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598396 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598455 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598506 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.615660 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 01:30:09.590195273 +0000 UTC Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701152 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701165 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804372 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804465 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804485 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.907960 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908057 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012158 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012246 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012293 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116659 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116691 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116704 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219727 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219766 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323100 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323272 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426515 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426587 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529720 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529807 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529855 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573831 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573912 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573916 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573848 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574018 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574248 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574365 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.616528 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 09:36:49.049420349 +0000 UTC Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633258 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633379 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633410 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633432 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736812 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736895 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736915 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736945 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736964 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840358 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840437 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840458 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840484 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840503 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944398 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944487 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047534 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047578 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047625 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151494 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151647 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254783 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254820 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.357980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358049 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358059 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358095 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358107 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462776 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462867 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462931 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566721 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566731 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566749 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566760 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.616928 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 19:06:51.72946099 +0000 UTC Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669991 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.670003 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773395 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876807 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876839 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876863 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980559 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980573 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980616 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980629 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083772 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083907 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083943 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187429 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187538 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187556 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290874 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290920 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290938 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394284 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394314 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394376 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.498261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.498686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.498899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.499153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.499409 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.573800 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.573910 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.573921 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574017 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574168 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.574261 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574428 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574633 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.602921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.602976 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.602994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.603021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.603041 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.617559 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 07:59:26.942004683 +0000 UTC Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706807 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706885 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810544 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810562 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914292 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914311 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914337 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914356 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.017978 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018044 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018062 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018109 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122288 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122322 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122348 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225883 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225907 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225960 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330109 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330305 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330331 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330350 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435398 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435454 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539185 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539263 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.618407 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:09:15.919003072 +0000 UTC Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642276 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642298 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642316 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745500 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745533 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745549 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848223 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848246 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848275 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848295 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951265 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951349 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951383 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951393 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055114 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055157 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160325 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160402 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160446 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264880 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368142 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368171 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368190 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471667 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471727 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471789 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573330 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573315 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573466 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.573518 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.573967 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.574051 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.574469 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575250 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575380 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575399 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.619794 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 09:02:24.547241814 +0000 UTC Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678706 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678732 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678794 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782897 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782917 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886797 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886860 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886881 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886894 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990466 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990558 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990586 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094776 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094831 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094872 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094887 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197549 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197591 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301454 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301509 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405486 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405553 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405577 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508768 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508810 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613601 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613664 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613677 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613717 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.620270 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 11:34:49.128726753 +0000 UTC Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717569 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717655 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717669 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820392 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820409 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820462 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922646 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922681 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922693 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922761 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025180 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025230 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025245 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128265 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128322 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128354 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128367 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231830 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231874 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231893 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334881 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438526 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438589 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438608 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438655 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.541889 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542000 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542027 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542084 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573170 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573239 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573264 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573320 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.573411 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.573620 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.573959 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.574116 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.621101 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 01:23:00.111668103 +0000 UTC Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645322 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645379 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.748962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749173 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.832190 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.832536 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.832645 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:28:09.832619158 +0000 UTC m=+163.944038525 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852615 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852711 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852768 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956704 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956750 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956766 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060411 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060489 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:06Z","lastTransitionTime":"2026-02-17T13:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143542 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143612 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:06Z","lastTransitionTime":"2026-02-17T13:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.218605 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h"] Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.219162 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: W0217 13:27:06.221766 4804 reflector.go:561] object-"openshift-cluster-version"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Feb 17 13:27:06 crc kubenswrapper[4804]: E0217 13:27:06.221818 4804 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.223819 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.224035 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.230504 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.278270 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4q55t" podStartSLOduration=79.278244135 podStartE2EDuration="1m19.278244135s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 13:27:06.255114443 +0000 UTC m=+100.366533790" watchObservedRunningTime="2026-02-17 13:27:06.278244135 +0000 UTC m=+100.389663482" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.299776 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.299748785 podStartE2EDuration="1m16.299748785s" podCreationTimestamp="2026-02-17 13:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.279044672 +0000 UTC m=+100.390464029" watchObservedRunningTime="2026-02-17 13:27:06.299748785 +0000 UTC m=+100.411168132" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.317906 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.317879032 podStartE2EDuration="48.317879032s" podCreationTimestamp="2026-02-17 13:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.300585381 +0000 UTC m=+100.412004728" watchObservedRunningTime="2026-02-17 13:27:06.317879032 +0000 UTC m=+100.429298379" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339158 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339226 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339242 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339261 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339289 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.348017 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4fbbv" podStartSLOduration=79.347998654 podStartE2EDuration="1m19.347998654s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-17 13:27:06.347621643 +0000 UTC m=+100.459040990" watchObservedRunningTime="2026-02-17 13:27:06.347998654 +0000 UTC m=+100.459417991" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.383070 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kclvs" podStartSLOduration=79.38304584 podStartE2EDuration="1m19.38304584s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.365982837 +0000 UTC m=+100.477402174" watchObservedRunningTime="2026-02-17 13:27:06.38304584 +0000 UTC m=+100.494465177" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.414819 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.414792286 podStartE2EDuration="14.414792286s" podCreationTimestamp="2026-02-17 13:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.413776453 +0000 UTC m=+100.525195800" watchObservedRunningTime="2026-02-17 13:27:06.414792286 +0000 UTC m=+100.526211633" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440855 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440916 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440950 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.441002 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.441054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc 
kubenswrapper[4804]: I0217 13:27:06.441092 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.448541 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.464864 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.496706 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" podStartSLOduration=78.496678915 podStartE2EDuration="1m18.496678915s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.496264141 +0000 UTC m=+100.607683478" watchObservedRunningTime="2026-02-17 13:27:06.496678915 +0000 UTC m=+100.608098272" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.545515 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.545482994 podStartE2EDuration="1m18.545482994s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.543924612 +0000 UTC m=+100.655343959" watchObservedRunningTime="2026-02-17 13:27:06.545482994 +0000 UTC m=+100.656902331" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.579685 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podStartSLOduration=79.579667631 podStartE2EDuration="1m19.579667631s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.578121499 +0000 UTC m=+100.689540836" watchObservedRunningTime="2026-02-17 13:27:06.579667631 +0000 UTC m=+100.691086968" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.602319 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z522z" podStartSLOduration=79.602290886 podStartE2EDuration="1m19.602290886s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.590156366 +0000 UTC m=+100.701575703" watchObservedRunningTime="2026-02-17 13:27:06.602290886 +0000 UTC m=+100.713710233" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.621413 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:29:12.260451102 +0000 UTC Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.621621 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating 
certificates Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.677005 4804 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.050517 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.052134 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.142615 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573554 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573648 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574282 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573782 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573669 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574413 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574060 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574514 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.665486 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" event={"ID":"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa","Type":"ContainerStarted","Data":"ecf158e37e36ff5370e83ad58a5c6c536f3c699310efe0e23372d83a3a73fe6a"} Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.665571 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" event={"ID":"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa","Type":"ContainerStarted","Data":"4b1c8010f5aaeda44df80b7740681f8f1788221ad9e605a1bac2f1b190c3f645"} Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.684394 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" podStartSLOduration=80.684375011 podStartE2EDuration="1m20.684375011s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:07.684290629 +0000 UTC m=+101.795710016" watchObservedRunningTime="2026-02-17 13:27:07.684375011 +0000 UTC m=+101.795794348" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573009 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573078 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573115 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573183 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.573322 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.573504 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.573744 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.574187 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.574482 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.574663 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.573390 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.573478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.574582 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.574591 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574694 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574798 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574830 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574904 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.590721 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.573742 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.573774 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.573908 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.573993 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.574075 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.574022 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.574290 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.574558 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.573277 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.573299 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.573459 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.573732 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.573824 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.573985 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.574189 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.574458 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:16 crc kubenswrapper[4804]: I0217 13:27:16.608659 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=5.608636632 podStartE2EDuration="5.608636632s" podCreationTimestamp="2026-02-17 13:27:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:16.606190682 +0000 UTC m=+110.717610019" watchObservedRunningTime="2026-02-17 13:27:16.608636632 +0000 UTC m=+110.720055989" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574020 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574104 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574153 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574314 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574335 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574481 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574578 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574624 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573103 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573105 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573178 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573249 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573383 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573477 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573702 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573930 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:20 crc kubenswrapper[4804]: I0217 13:27:20.575479 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:27:20 crc kubenswrapper[4804]: E0217 13:27:20.575791 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.574422 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.575035 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.575062 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.575269 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.576170 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.576488 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.576626 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.576764 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.720392 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log"
Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722024 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/0.log"
Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722105 4804 generic.go:334] "Generic (PLEG): container finished" podID="42eec48d-c990-43e6-8348-d9f78997ec3b" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a" exitCode=1
Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722170 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerDied","Data":"2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"}
Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722277 4804 scope.go:117] "RemoveContainer" containerID="26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa"
Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.723084 4804 scope.go:117] "RemoveContainer" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"
Feb 17 13:27:22 crc kubenswrapper[4804]: E0217 13:27:22.723454 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-kclvs_openshift-multus(42eec48d-c990-43e6-8348-d9f78997ec3b)\"" pod="openshift-multus/multus-kclvs" podUID="42eec48d-c990-43e6-8348-d9f78997ec3b"
Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573290 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573373 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573289 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573494 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573430 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573568 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573779 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573873 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.726417 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log"
Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.573983 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574113 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.574260 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.574260 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574427 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574546 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.574296 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574651 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:26 crc kubenswrapper[4804]: E0217 13:27:26.590463 4804 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 17 13:27:26 crc kubenswrapper[4804]: E0217 13:27:26.710666 4804 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.573486 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.573526 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.573631 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.573692 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.573773 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.573922 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.574393 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.574539 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573237 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573313 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.573431 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573513 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.573641 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573706 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.573853 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.574027 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573573 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573617 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.573770 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.573954 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573959 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.574074 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573992 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.574455 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.712569 4804 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.573756 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.573889 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.573957 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.574048 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.574235 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.574243 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.574382 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.574515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.574415 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.770796 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.773767 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"}
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.775449 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.822145 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podStartSLOduration=107.822119374 podStartE2EDuration="1m47.822119374s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:34.820460109 +0000 UTC m=+128.931879486" watchObservedRunningTime="2026-02-17 13:27:34.822119374 +0000 UTC m=+128.933538751"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.489036 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jfgm"]
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.489165 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.489271 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.573300 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.573333 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.573293 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.573446 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.573587 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.573649 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:36 crc kubenswrapper[4804]: E0217 13:27:36.713826 4804 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574001 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574040 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574094 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.574330 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574372 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.574508 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574769 4804 scope.go:117] "RemoveContainer" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.574904 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.575184 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.789430 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log"
Feb 17 13:27:38 crc kubenswrapper[4804]: I0217 13:27:38.797243 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log"
Feb 17 13:27:38 crc kubenswrapper[4804]: I0217 13:27:38.797326 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde"}
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573690 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573733 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573713 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573855 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575084 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575316 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575513 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575617 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.572950 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.574351 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.572985 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.574698 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.572962 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.573155 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.575031 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.575245 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.573854 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.573930 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.573850 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.575478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.578614 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.579425 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.579598 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.579858 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.581926 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.582168 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.909973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.958810 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.959461 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.965583 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.967037 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.967570 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.967570 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.971972 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.972677 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.973788 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.974046 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.975491 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.976883 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.978045 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-w4nl5"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.978383 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.978732 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w4nl5"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.979789 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.980592 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.982055 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.983162 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.984185 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.986519 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.987031 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpzqw"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.987463 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.987468 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.993855 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h48zc"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.994341 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035636 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-serving-cert\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035689 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035718 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-auth-proxy-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035744 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879pl\" (UniqueName: \"kubernetes.io/projected/4c36b00a-bd3f-424c-a67b-d828d782e60f-kube-api-access-879pl\") pod \"downloads-7954f5f757-w4nl5\" (UID: \"4c36b00a-bd3f-424c-a67b-d828d782e60f\") " pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035766 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035791 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035814 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035836 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035870 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s664q\" (UniqueName: \"kubernetes.io/projected/88e84359-a2f8-4d55-96e4-fda2ff226372-kube-api-access-s664q\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" 
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035892 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-service-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035916 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fc9b807-491e-4540-80ba-dd9243fa514c-metrics-tls\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035939 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035965 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-serving-cert\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035987 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0593410f-5966-4c13-9978-dbb0dee5faab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" 
(UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036015 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036057 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036097 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b1f4b64e-0bcd-420e-a4d6-80918348ed75-machine-approver-tls\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036156 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036292 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-config\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036339 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dc5q\" (UniqueName: \"kubernetes.io/projected/0593410f-5966-4c13-9978-dbb0dee5faab-kube-api-access-5dc5q\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036362 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036402 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6nrg\" (UniqueName: \"kubernetes.io/projected/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-kube-api-access-x6nrg\") pod 
\"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036435 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-policies\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036461 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-dir\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036616 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036679 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036747 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-serving-cert\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036787 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036826 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-client\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036866 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036900 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kp4p\" (UniqueName: \"kubernetes.io/projected/b1f4b64e-0bcd-420e-a4d6-80918348ed75-kube-api-access-6kp4p\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036935 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-encryption-config\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036967 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjld\" (UniqueName: \"kubernetes.io/projected/6318da7c-2891-47a5-bebf-edd3da5a103a-kube-api-access-cmjld\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037004 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-config\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037034 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf85x\" 
(UniqueName: \"kubernetes.io/projected/7fc9b807-491e-4540-80ba-dd9243fa514c-kube-api-access-qf85x\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-client\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037111 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snrpl\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-kube-api-access-snrpl\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.084540 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.084563 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.084663 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.085722 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-46w22"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.086377 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.093029 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.093293 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.093541 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.098589 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117107 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117222 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117404 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117529 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117831 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118075 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118260 4804 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118299 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118376 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118429 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118494 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118676 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118724 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118823 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118952 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118991 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118269 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.119252 
4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.119355 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.119459 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.124167 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.124659 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.125689 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.125970 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126127 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126428 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126531 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126564 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126643 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126732 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.128084 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.128171 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.130219 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131025 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131250 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131416 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131532 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132095 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131607 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-spfls"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145224 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132163 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132224 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132269 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132319 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132365 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132405 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.132446 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132491 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132530 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.133176 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.144161 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131789 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145580 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145814 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.138035 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit-dir\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145951 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-service-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145990 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146013 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fc9b807-491e-4540-80ba-dd9243fa514c-metrics-tls\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146029 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-serving-cert\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146042 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mcszv"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146048 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-image-import-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146068 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146087 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0593410f-5966-4c13-9978-dbb0dee5faab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146113 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146129 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146144 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b1f4b64e-0bcd-420e-a4d6-80918348ed75-machine-approver-tls\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146191 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146253 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dc5q\" (UniqueName: \"kubernetes.io/projected/0593410f-5966-4c13-9978-dbb0dee5faab-kube-api-access-5dc5q\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146295 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146322 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-config\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146339 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-policies\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146354 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146370 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-dir\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.149116 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.149763 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-service-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150143 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6nrg\" (UniqueName: \"kubernetes.io/projected/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-kube-api-access-x6nrg\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150177 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-node-pullsecrets\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggj9b\" (UniqueName: \"kubernetes.io/projected/3ea797e4-54e0-4063-8d2b-647f6686e2a8-kube-api-access-ggj9b\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150316 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150336 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-serving-cert\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150354 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150368 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kp4p\" (UniqueName: \"kubernetes.io/projected/b1f4b64e-0bcd-420e-a4d6-80918348ed75-kube-api-access-6kp4p\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150385 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-client\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150423 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjld\" (UniqueName: \"kubernetes.io/projected/6318da7c-2891-47a5-bebf-edd3da5a103a-kube-api-access-cmjld\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150440 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-encryption-config\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150458 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-config\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150473 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf85x\" (UniqueName: \"kubernetes.io/projected/7fc9b807-491e-4540-80ba-dd9243fa514c-kube-api-access-qf85x\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150498 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-client\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snrpl\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-kube-api-access-snrpl\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150545 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-serving-cert\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150561 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-client\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150579 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-879pl\" (UniqueName: \"kubernetes.io/projected/4c36b00a-bd3f-424c-a67b-d828d782e60f-kube-api-access-879pl\") pod \"downloads-7954f5f757-w4nl5\" (UID: \"4c36b00a-bd3f-424c-a67b-d828d782e60f\") " pod="openshift-console/downloads-7954f5f757-w4nl5"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150617 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-auth-proxy-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150634 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150674 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150694 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150709 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-encryption-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-serving-cert\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150746 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150793 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s664q\" (UniqueName: \"kubernetes.io/projected/88e84359-a2f8-4d55-96e4-fda2ff226372-kube-api-access-s664q\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151321 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151523 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151536 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151730 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151867 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151976 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152000 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-config\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152083 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152495 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-policies\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152612 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.168331 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.169629 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.170918 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-serving-cert\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.171344 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.171866 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fc9b807-491e-4540-80ba-dd9243fa514c-metrics-tls\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.172164 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.172342 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.172563 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.173386 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.176068 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.189431 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.190045 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b1f4b64e-0bcd-420e-a4d6-80918348ed75-machine-approver-tls\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.191389 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.191618 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0593410f-5966-4c13-9978-dbb0dee5faab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.191879 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192156 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.174612 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-dir\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192233 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192389 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kbpk6"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-auth-proxy-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192736 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.193212 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.193591 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.193920 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.197469 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-encryption-config\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.197937 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.198171 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.198488 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.199508 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200018 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-config\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200092 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200444 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200746 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.201043 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.201367 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.203787 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.204676 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.205885 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.208035 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.209323 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.209574 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.210273 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.210306 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.210491 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211945 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.213529 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.213942 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211451 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.215492 4804 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211509 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211525 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211579 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211601 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211633 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211781 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211850 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211889 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211923 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211982 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.212016 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212097 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212211 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212247 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212339 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212388 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.221714 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.222469 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.222918 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.223177 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.223218 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.226740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-serving-cert\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.224355 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.224911 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.225073 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dc5q\" (UniqueName: \"kubernetes.io/projected/0593410f-5966-4c13-9978-dbb0dee5faab-kube-api-access-5dc5q\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.225718 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.225783 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227036 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.226510 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.226602 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227150 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227248 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.223885 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-client\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227392 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227411 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.227572 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227747 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.228103 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227486 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.230483 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.230838 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.230881 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.231366 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.231538 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.232035 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.232402 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.232879 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.233111 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.233814 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.234889 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6nrg\" (UniqueName: \"kubernetes.io/projected/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-kube-api-access-x6nrg\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.234907 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s664q\" (UniqueName: \"kubernetes.io/projected/88e84359-a2f8-4d55-96e4-fda2ff226372-kube-api-access-s664q\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.234953 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5sp6x"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.235623 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.236673 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.238261 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.260247 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.260497 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snrpl\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-kube-api-access-snrpl\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.261605 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf85x\" (UniqueName: \"kubernetes.io/projected/7fc9b807-491e-4540-80ba-dd9243fa514c-kube-api-access-qf85x\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.262058 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.262142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-serving-cert\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.262878 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kp4p\" (UniqueName: \"kubernetes.io/projected/b1f4b64e-0bcd-420e-a4d6-80918348ed75-kube-api-access-6kp4p\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264000 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-client\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264617 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264666 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-config\") pod 
\"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264692 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96df7f4c-b782-43e2-99b2-fa5219a59fd9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264710 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4xc\" (UniqueName: \"kubernetes.io/projected/96df7f4c-b782-43e2-99b2-fa5219a59fd9-kube-api-access-8l4xc\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-client\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264767 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb5c679-7c23-47fe-92b2-e035dceef1be-serving-cert\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264816 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-images\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264846 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264867 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264890 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-serving-cert\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-encryption-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264935 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-trusted-ca\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264956 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vf78\" (UniqueName: \"kubernetes.io/projected/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-kube-api-access-2vf78\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264991 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km45n\" (UniqueName: \"kubernetes.io/projected/bfb5c679-7c23-47fe-92b2-e035dceef1be-kube-api-access-km45n\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265015 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit-dir\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265033 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265054 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-config\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265074 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96df7f4c-b782-43e2-99b2-fa5219a59fd9-proxy-tls\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265095 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-image-import-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265141 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265159 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-node-pullsecrets\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265195 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggj9b\" (UniqueName: \"kubernetes.io/projected/3ea797e4-54e0-4063-8d2b-647f6686e2a8-kube-api-access-ggj9b\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.266665 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.267913 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.268160 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit-dir\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.268683 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.270112 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-image-import-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.270253 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-node-pullsecrets\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.270803 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.271848 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.272021 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.272293 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.272989 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.273095 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.274944 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-encryption-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.275820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-client\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.275856 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.278057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-serving-cert\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.279149 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.281475 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w4nl5"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.283389 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.287000 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.288409 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-spfls"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.289392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.290386 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h48zc"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.291614 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"] Feb 17 13:27:47 
crc kubenswrapper[4804]: I0217 13:27:47.291782 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.293088 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mcszv"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.299794 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.301238 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.301455 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.302576 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.304276 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.304922 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.305911 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-46w22"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.307800 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.308111 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.310985 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.312606 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.316541 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.317723 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.319944 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpzqw"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.321007 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.321384 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.324278 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-td8n5"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.324898 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.325225 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwjsv"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.329310 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331036 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331080 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331096 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5sp6x"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331110 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.333370 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.333409 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.334355 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.335361 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-td8n5"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 
13:27:47.336649 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d6mxf"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.337321 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.337746 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.338053 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.338762 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ssf69"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.339549 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.339774 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.340945 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.341675 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.342439 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.344121 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-vwjsv"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.344743 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.345067 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.346681 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.351936 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6mxf"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.358708 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.362317 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366649 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-config\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366699 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96df7f4c-b782-43e2-99b2-fa5219a59fd9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366727 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4xc\" (UniqueName: \"kubernetes.io/projected/96df7f4c-b782-43e2-99b2-fa5219a59fd9-kube-api-access-8l4xc\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366762 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb5c679-7c23-47fe-92b2-e035dceef1be-serving-cert\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366782 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-images\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-trusted-ca\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366867 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vf78\" (UniqueName: 
\"kubernetes.io/projected/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-kube-api-access-2vf78\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366897 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km45n\" (UniqueName: \"kubernetes.io/projected/bfb5c679-7c23-47fe-92b2-e035dceef1be-kube-api-access-km45n\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366919 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-config\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366938 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96df7f4c-b782-43e2-99b2-fa5219a59fd9-proxy-tls\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366996 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 
13:27:47.368600 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-images\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.368948 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96df7f4c-b782-43e2-99b2-fa5219a59fd9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.369313 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-config\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.370038 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-trusted-ca\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.370423 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.370659 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-config\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.374373 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb5c679-7c23-47fe-92b2-e035dceef1be-serving-cert\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.380850 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.381127 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.402492 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.404640 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.423090 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.438124 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1f4b64e_0bcd_420e_a4d6_80918348ed75.slice/crio-3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece WatchSource:0}: Error finding container 3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece: Status 404 returned error can't find the container with id 3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.442984 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.461989 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.481955 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.518098 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: 
I0217 13:27:47.521566 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.583700 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjld\" (UniqueName: \"kubernetes.io/projected/6318da7c-2891-47a5-bebf-edd3da5a103a-kube-api-access-cmjld\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.585770 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.586280 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-879pl\" (UniqueName: \"kubernetes.io/projected/4c36b00a-bd3f-424c-a67b-d828d782e60f-kube-api-access-879pl\") pod \"downloads-7954f5f757-w4nl5\" (UID: \"4c36b00a-bd3f-424c-a67b-d828d782e60f\") " pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.587953 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"] Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.597971 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e84359_a2f8_4d55_96e4_fda2ff226372.slice/crio-c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961 WatchSource:0}: Error finding container c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961: Status 404 returned error can't find the container with id c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961 Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.602651 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.621910 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.630986 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.641715 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.683801 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.692887 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.701896 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.740148 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpzqw"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.751092 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.763006 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.764261 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fc9b807_491e_4540_80ba_dd9243fa514c.slice/crio-a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf WatchSource:0}: Error finding container a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf: Status 404 returned error can't find the container with id a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.782954 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.807300 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w4nl5"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.809814 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.820406 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c36b00a_bd3f_424c_a67b_d828d782e60f.slice/crio-05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169 WatchSource:0}: Error finding container 05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169: Status 404 returned error can't find the container with id 05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169 Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.822090 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.840751 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" 
event={"ID":"7fc9b807-491e-4540-80ba-dd9243fa514c","Type":"ContainerStarted","Data":"a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.841936 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w4nl5" event={"ID":"4c36b00a-bd3f-424c-a67b-d828d782e60f","Type":"ContainerStarted","Data":"05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.843473 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" event={"ID":"88e84359-a2f8-4d55-96e4-fda2ff226372","Type":"ContainerStarted","Data":"c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.843710 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.846388 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" event={"ID":"b1f4b64e-0bcd-420e-a4d6-80918348ed75","Type":"ContainerStarted","Data":"41bb1edb1f0afe3c11b4dbb133726a71c2b34b6e410a4e525ffe2969562e7d2e"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.846439 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" event={"ID":"b1f4b64e-0bcd-420e-a4d6-80918348ed75","Type":"ContainerStarted","Data":"3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.852312 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.862655 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.867862 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eb6b4b9_9e2e_4f39_9df0_068cfea71701.slice/crio-981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba WatchSource:0}: Error finding container 981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba: Status 404 returned error can't find the container with id 981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.869613 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.882106 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.900236 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h48zc"] Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.900930 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f4f3a44_0dd9_49eb_bbb8_ada255b278de.slice/crio-e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e WatchSource:0}: Error finding container e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e: Status 404 returned error can't find the container with id e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.902097 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.910695 4804 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.922954 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.944503 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.944831 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.961542 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.976987 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8b8b00_f9df_45f1_97af_4d88c02a4d98.slice/crio-18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e WatchSource:0}: Error finding container 18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e: Status 404 returned error can't find the container with id 18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e
Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.977312 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6318da7c_2891_47a5_bebf_edd3da5a103a.slice/crio-48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593 WatchSource:0}: Error finding container 48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593: Status 404 returned error can't find the container with id 48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.989207 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.002435 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.028695 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.042321 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.062590 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.082440 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.102263 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.127577 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.142604 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.162065 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.182110 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.202759 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.220458 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96df7f4c-b782-43e2-99b2-fa5219a59fd9-proxy-tls\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.222150 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.240289 4804 request.go:700] Waited for 1.014622436s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-dockercfg-5nsgg&limit=500&resourceVersion=0
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.242319 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.262330 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.288388 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.301705 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.322136 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.342455 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.362547 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.381872 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.403208 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.421932 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.441953 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.462374 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.482768 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.502343 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.522042 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.542757 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.562738 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.583812 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.603167 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.622757 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.642232 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.661526 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.682825 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.702428 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.722306 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.742235 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.772628 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.782632 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.801969 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.822814 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.841783 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.851479 4804 generic.go:334] "Generic (PLEG): container finished" podID="88e84359-a2f8-4d55-96e4-fda2ff226372" containerID="db840a682474c367b56e255ffdf3eb9d92165aa180bac672885d9c489c386a53" exitCode=0
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.851606 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" event={"ID":"88e84359-a2f8-4d55-96e4-fda2ff226372","Type":"ContainerDied","Data":"db840a682474c367b56e255ffdf3eb9d92165aa180bac672885d9c489c386a53"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.863251 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.883434 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.884921 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" event={"ID":"8a8b8b00-f9df-45f1-97af-4d88c02a4d98","Type":"ContainerStarted","Data":"aae7fce8c275b283de609a3216bd0f9365d33571250507615ba1a3458f02bd7f"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.885158 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" event={"ID":"8a8b8b00-f9df-45f1-97af-4d88c02a4d98","Type":"ContainerStarted","Data":"18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.886972 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" event={"ID":"6318da7c-2891-47a5-bebf-edd3da5a103a","Type":"ContainerStarted","Data":"7f15b20168e423fd201d1eeced63644c8d561aaf7c39309ce39a2fd78e85e8c6"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.887039 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" event={"ID":"6318da7c-2891-47a5-bebf-edd3da5a103a","Type":"ContainerStarted","Data":"48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.889453 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" event={"ID":"b1f4b64e-0bcd-420e-a4d6-80918348ed75","Type":"ContainerStarted","Data":"82df64a60b5bf1a6b580ab38323b9d7b57eaf3e7ae18fd799114f6c9f29f9199"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.891447 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerStarted","Data":"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.891482 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerStarted","Data":"981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.893569 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w4nl5" event={"ID":"4c36b00a-bd3f-424c-a67b-d828d782e60f","Type":"ContainerStarted","Data":"70c98ed9b17459ecce8c557560b6c3b5a2d9498e3d4a6a2641164154a5d9ebcd"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.894046 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-w4nl5"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.895694 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.895761 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.896415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" event={"ID":"0593410f-5966-4c13-9978-dbb0dee5faab","Type":"ContainerStarted","Data":"f6954b1b50d4c56f25c40c74efaa930fbef86c99f3816505b436f7f793732ff3"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.896466 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" event={"ID":"0593410f-5966-4c13-9978-dbb0dee5faab","Type":"ContainerStarted","Data":"89e9f8509daca8e8da2fceec2fdf5d9553ef99d017fe202811ed00db2b2a43f0"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.896490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" event={"ID":"0593410f-5966-4c13-9978-dbb0dee5faab","Type":"ContainerStarted","Data":"c4cb842047cae58db9bab53fd5252ce21137be446d1b7b340d70b2900c7b8f95"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.899065 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" event={"ID":"7fc9b807-491e-4540-80ba-dd9243fa514c","Type":"ContainerStarted","Data":"fd0a18e263b6382f0a6ad77c2670c4636aeee3f39b9237950274312c8d97f3bc"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.899109 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" event={"ID":"7fc9b807-491e-4540-80ba-dd9243fa514c","Type":"ContainerStarted","Data":"2a80dc7e230ee0a74471b1e6d89c5f07eec8b9a58a7070d834ff59c4154c186b"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.901898 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.902065 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" event={"ID":"2f4f3a44-0dd9-49eb-bbb8-ada255b278de","Type":"ContainerStarted","Data":"1a91717e61ed2e777aee1447674239b7a349199d2d59382e934466af8cf2fdae"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.902111 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" event={"ID":"2f4f3a44-0dd9-49eb-bbb8-ada255b278de","Type":"ContainerStarted","Data":"e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e"}
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.942098 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.943720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggj9b\" (UniqueName: \"kubernetes.io/projected/3ea797e4-54e0-4063-8d2b-647f6686e2a8-kube-api-access-ggj9b\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.962711 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.983135 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.002417 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.022289 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.047411 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.065379 4804 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.083686 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.104838 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.121730 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.141775 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.162546 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.182523 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.202293 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.204565 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.223420 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.257332 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4xc\" (UniqueName: \"kubernetes.io/projected/96df7f4c-b782-43e2-99b2-fa5219a59fd9-kube-api-access-8l4xc\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.262294 4804 request.go:700] Waited for 1.892694237s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.281145 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km45n\" (UniqueName: \"kubernetes.io/projected/bfb5c679-7c23-47fe-92b2-e035dceef1be-kube-api-access-km45n\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.305546 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vf78\" (UniqueName: \"kubernetes.io/projected/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-kube-api-access-2vf78\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.539939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.540860 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.0408037 +0000 UTC m=+144.152223077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.541767 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.543287 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.543392 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.576028 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-46w22"]
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642257 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.642475 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.142432505 +0000 UTC m=+144.253851842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642526 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-default-certificate\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642575 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642599 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400eb64-255c-46c2-b6c6-39260e013e92-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642654 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642705 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642756 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9400eb64-255c-46c2-b6c6-39260e013e92-config\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646575 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646649 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/70a41b60-6ec1-491d-9d3e-88758d91c45e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646679 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646709 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/074c752f-fec1-4bd6-8773-596461ea288a-kube-api-access-rgt6h\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646732 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.646856 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.146831702 +0000 UTC m=+144.258251189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647276 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70a41b60-6ec1-491d-9d3e-88758d91c45e-serving-cert\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647327 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6h2\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-kube-api-access-pp6h2\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647354 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647378 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647403 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647442 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647464 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649374 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde5d02-8e0d-4b24-b7bc-b9365013d942-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649419 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwgww\" (UniqueName: \"kubernetes.io/projected/6f8789cf-f788-4c81-9624-532aa823de1c-kube-api-access-cwgww\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649460 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649509 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3cd53a-4a82-449d-a270-b41853fa2c8a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649608 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649648 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9400eb64-255c-46c2-b6c6-39260e013e92-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649670 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-stats-auth\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217
13:27:49.649691 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd4df830-6ec9-4f4d-860e-f97af3088371-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649713 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649737 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3cd53a-4a82-449d-a270-b41853fa2c8a-config\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649777 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c3cd53a-4a82-449d-a270-b41853fa2c8a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649797 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649840 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8789cf-f788-4c81-9624-532aa823de1c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649861 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649882 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7dtm\" (UniqueName: \"kubernetes.io/projected/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-kube-api-access-k7dtm\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649903 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: 
\"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650146 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650227 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4df830-6ec9-4f4d-860e-f97af3088371-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650274 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074c752f-fec1-4bd6-8773-596461ea288a-service-ca-bundle\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650378 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650399 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650441 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650467 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650525 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650548 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650587 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-metrics-certs\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650612 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cde5d02-8e0d-4b24-b7bc-b9365013d942-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650632 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8789cf-f788-4c81-9624-532aa823de1c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650656 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n95dt\" (UniqueName: \"kubernetes.io/projected/70a41b60-6ec1-491d-9d3e-88758d91c45e-kube-api-access-n95dt\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650697 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650738 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651604 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7cde5d02-8e0d-4b24-b7bc-b9365013d942-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651624 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651659 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651684 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.752670 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.752758 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.252740519 +0000 UTC m=+144.364159856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753348 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwgww\" (UniqueName: \"kubernetes.io/projected/6f8789cf-f788-4c81-9624-532aa823de1c-kube-api-access-cwgww\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753386 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753456 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753481 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h988v\" (UniqueName: \"kubernetes.io/projected/c36c8731-9ee6-4ce6-8708-9e35e6112804-kube-api-access-h988v\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753502 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-cabundle\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753539 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753584 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-228st\" (UniqueName: \"kubernetes.io/projected/527ee9be-17be-4352-86fc-ef31bece3e86-kube-api-access-228st\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753611 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3cd53a-4a82-449d-a270-b41853fa2c8a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753644 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753670 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753706 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c36c8731-9ee6-4ce6-8708-9e35e6112804-metrics-tls\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753731 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9400eb64-255c-46c2-b6c6-39260e013e92-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753754 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-stats-auth\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753778 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd4df830-6ec9-4f4d-860e-f97af3088371-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753801 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753823 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4c3cd53a-4a82-449d-a270-b41853fa2c8a-config\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753844 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c3cd53a-4a82-449d-a270-b41853fa2c8a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753868 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753893 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753920 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-profile-collector-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753943 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8789cf-f788-4c81-9624-532aa823de1c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753964 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753987 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7dtm\" (UniqueName: \"kubernetes.io/projected/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-kube-api-access-k7dtm\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754009 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5192d8-6708-48c6-b5e5-a081f89d3e66-config\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754031 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754067 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-certs\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754383 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-registration-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754406 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-key\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754447 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dzxb\" (UniqueName: \"kubernetes.io/projected/1975682c-3445-467d-a0bd-a87b0ebf604b-kube-api-access-4dzxb\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc 
kubenswrapper[4804]: I0217 13:27:49.754471 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-apiservice-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754496 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754522 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5dmh\" (UniqueName: \"kubernetes.io/projected/81a4453c-e1e8-4624-a19b-f08ec4df93d7-kube-api-access-j5dmh\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754570 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z9lv\" (UniqueName: 
\"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754594 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4df830-6ec9-4f4d-860e-f97af3088371-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754630 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074c752f-fec1-4bd6-8773-596461ea288a-service-ca-bundle\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754655 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-srv-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754683 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754711 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754735 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spwnl\" (UniqueName: \"kubernetes.io/projected/360a1093-b581-4806-9f88-3d3907bd4895-kube-api-access-spwnl\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754757 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e609565-a380-48f1-9b14-542a17c4ea50-cert\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754779 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6882836-eb39-412c-a0d6-4906c9be9b89-proxy-tls\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754813 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: 
\"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754836 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e27ee8-4574-4731-9324-031f9b3a209f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.755084 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.757644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074c752f-fec1-4bd6-8773-596461ea288a-service-ca-bundle\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.758606 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8789cf-f788-4c81-9624-532aa823de1c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.758780 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3cd53a-4a82-449d-a270-b41853fa2c8a-config\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.758967 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.759158 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.759825 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.759823 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc 
kubenswrapper[4804]: I0217 13:27:49.760474 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760655 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd4df830-6ec9-4f4d-860e-f97af3088371-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760759 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aaa28d2-1ca6-42c3-98f7-58c644a03061-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760904 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod 
\"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760924 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4df830-6ec9-4f4d-860e-f97af3088371-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761085 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761115 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-webhook-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761141 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761238 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-metrics-certs\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761285 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cde5d02-8e0d-4b24-b7bc-b9365013d942-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761305 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c36c8731-9ee6-4ce6-8708-9e35e6112804-config-volume\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761416 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8789cf-f788-4c81-9624-532aa823de1c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761436 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n95dt\" (UniqueName: \"kubernetes.io/projected/70a41b60-6ec1-491d-9d3e-88758d91c45e-kube-api-access-n95dt\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761453 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761525 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761548 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761603 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cde5d02-8e0d-4b24-b7bc-b9365013d942-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761634 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/527ee9be-17be-4352-86fc-ef31bece3e86-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761676 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761692 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761870 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxd8k\" (UniqueName: \"kubernetes.io/projected/28e27ee8-4574-4731-9324-031f9b3a209f-kube-api-access-kxd8k\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761904 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761947 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-default-certificate\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761978 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761996 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxzwc\" (UniqueName: \"kubernetes.io/projected/4a5192d8-6708-48c6-b5e5-a081f89d3e66-kube-api-access-gxzwc\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.762258 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.262246258 +0000 UTC m=+144.373665595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.763811 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.770247 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-stats-auth\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " 
pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.770657 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.770686 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771011 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8789cf-f788-4c81-9624-532aa823de1c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771350 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9400eb64-255c-46c2-b6c6-39260e013e92-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771523 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjhr\" (UniqueName: \"kubernetes.io/projected/2aaa28d2-1ca6-42c3-98f7-58c644a03061-kube-api-access-wcjhr\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771573 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771607 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771788 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 
crc kubenswrapper[4804]: I0217 13:27:49.771926 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.772006 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.772035 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnm4h\" (UniqueName: \"kubernetes.io/projected/ea50fe9b-465a-448b-97db-a91822afb720-kube-api-access-qnm4h\") pod \"migrator-59844c95c7-q46rz\" (UID: \"ea50fe9b-465a-448b-97db-a91822afb720\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773265 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773345 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: 
\"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773361 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9400eb64-255c-46c2-b6c6-39260e013e92-config\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773409 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7mtv\" (UniqueName: \"kubernetes.io/projected/a6882836-eb39-412c-a0d6-4906c9be9b89-kube-api-access-l7mtv\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773427 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773539 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5192d8-6708-48c6-b5e5-a081f89d3e66-serving-cert\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773568 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773614 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-csi-data-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773664 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/360a1093-b581-4806-9f88-3d3907bd4895-tmpfs\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773732 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 
13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773754 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/70a41b60-6ec1-491d-9d3e-88758d91c45e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773805 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54d9r\" (UniqueName: \"kubernetes.io/projected/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-kube-api-access-54d9r\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773872 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/074c752f-fec1-4bd6-8773-596461ea288a-kube-api-access-rgt6h\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773888 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9400eb64-255c-46c2-b6c6-39260e013e92-config\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773894 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773955 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-node-bootstrap-token\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773975 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-images\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774008 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-socket-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774256 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70a41b60-6ec1-491d-9d3e-88758d91c45e-serving-cert\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774284 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx9tr\" (UniqueName: \"kubernetes.io/projected/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-kube-api-access-kx9tr\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774605 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6h2\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-kube-api-access-pp6h2\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774669 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-mountpoint-dir\") pod 
\"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774764 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774783 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-srv-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774800 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e27ee8-4574-4731-9324-031f9b3a209f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774803 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/70a41b60-6ec1-491d-9d3e-88758d91c45e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774822 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-plugins-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775789 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde5d02-8e0d-4b24-b7bc-b9365013d942-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775815 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbwg\" (UniqueName: \"kubernetes.io/projected/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-kube-api-access-khbwg\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775838 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4s62\" (UniqueName: \"kubernetes.io/projected/fd233b99-2205-4e95-ba04-232015517afb-kube-api-access-g4s62\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8dn\" (UniqueName: \"kubernetes.io/projected/8e609565-a380-48f1-9b14-542a17c4ea50-kube-api-access-2k8dn\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.776041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.776058 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.776980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9400eb64-255c-46c2-b6c6-39260e013e92-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.777330 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.777752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde5d02-8e0d-4b24-b7bc-b9365013d942-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.778052 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 
17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.778672 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.779937 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cde5d02-8e0d-4b24-b7bc-b9365013d942-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.782340 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70a41b60-6ec1-491d-9d3e-88758d91c45e-serving-cert\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.782881 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.782949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: 
\"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.783494 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-default-certificate\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.784371 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.784725 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.785443 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-metrics-certs\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.788023 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.799641 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwgww\" (UniqueName: \"kubernetes.io/projected/6f8789cf-f788-4c81-9624-532aa823de1c-kube-api-access-cwgww\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.800467 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.801710 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.811525 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3cd53a-4a82-449d-a270-b41853fa2c8a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: 
I0217 13:27:49.819290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.838920 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c3cd53a-4a82-449d-a270-b41853fa2c8a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.859484 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7dtm\" (UniqueName: \"kubernetes.io/projected/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-kube-api-access-k7dtm\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.876953 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: 
\"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877116 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnm4h\" (UniqueName: \"kubernetes.io/projected/ea50fe9b-465a-448b-97db-a91822afb720-kube-api-access-qnm4h\") pod \"migrator-59844c95c7-q46rz\" (UID: \"ea50fe9b-465a-448b-97db-a91822afb720\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877159 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7mtv\" (UniqueName: \"kubernetes.io/projected/a6882836-eb39-412c-a0d6-4906c9be9b89-kube-api-access-l7mtv\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877178 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5192d8-6708-48c6-b5e5-a081f89d3e66-serving-cert\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877212 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877239 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-csi-data-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877260 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/360a1093-b581-4806-9f88-3d3907bd4895-tmpfs\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.877289 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.377277261 +0000 UTC m=+144.488696598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877307 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54d9r\" (UniqueName: \"kubernetes.io/projected/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-kube-api-access-54d9r\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877341 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-node-bootstrap-token\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877357 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-images\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877377 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-socket-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: 
\"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877394 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx9tr\" (UniqueName: \"kubernetes.io/projected/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-kube-api-access-kx9tr\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877414 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-mountpoint-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877434 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-srv-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e27ee8-4574-4731-9324-031f9b3a209f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877466 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" 
(UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-plugins-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877483 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbwg\" (UniqueName: \"kubernetes.io/projected/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-kube-api-access-khbwg\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877498 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4s62\" (UniqueName: \"kubernetes.io/projected/fd233b99-2205-4e95-ba04-232015517afb-kube-api-access-g4s62\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877514 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8dn\" (UniqueName: \"kubernetes.io/projected/8e609565-a380-48f1-9b14-542a17c4ea50-kube-api-access-2k8dn\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877530 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877547 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h988v\" (UniqueName: \"kubernetes.io/projected/c36c8731-9ee6-4ce6-8708-9e35e6112804-kube-api-access-h988v\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877561 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-cabundle\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877579 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-228st\" (UniqueName: \"kubernetes.io/projected/527ee9be-17be-4352-86fc-ef31bece3e86-kube-api-access-228st\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877594 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c36c8731-9ee6-4ce6-8708-9e35e6112804-metrics-tls\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877611 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 
13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877631 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877653 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-profile-collector-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877675 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5192d8-6708-48c6-b5e5-a081f89d3e66-config\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-certs\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-key\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877738 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-registration-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877770 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dzxb\" (UniqueName: \"kubernetes.io/projected/1975682c-3445-467d-a0bd-a87b0ebf604b-kube-api-access-4dzxb\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877792 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-apiservice-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877819 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5dmh\" (UniqueName: \"kubernetes.io/projected/81a4453c-e1e8-4624-a19b-f08ec4df93d7-kube-api-access-j5dmh\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877842 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod 
\"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-srv-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6882836-eb39-412c-a0d6-4906c9be9b89-proxy-tls\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877905 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spwnl\" (UniqueName: \"kubernetes.io/projected/360a1093-b581-4806-9f88-3d3907bd4895-kube-api-access-spwnl\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877925 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e609565-a380-48f1-9b14-542a17c4ea50-cert\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877947 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877972 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e27ee8-4574-4731-9324-031f9b3a209f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877996 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878047 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aaa28d2-1ca6-42c3-98f7-58c644a03061-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878071 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-webhook-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878100 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c36c8731-9ee6-4ce6-8708-9e35e6112804-config-volume\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878133 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/527ee9be-17be-4352-86fc-ef31bece3e86-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878223 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxd8k\" (UniqueName: \"kubernetes.io/projected/28e27ee8-4574-4731-9324-031f9b3a209f-kube-api-access-kxd8k\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878246 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878269 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878274 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878304 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxzwc\" (UniqueName: \"kubernetes.io/projected/4a5192d8-6708-48c6-b5e5-a081f89d3e66-kube-api-access-gxzwc\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878326 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjhr\" (UniqueName: \"kubernetes.io/projected/2aaa28d2-1ca6-42c3-98f7-58c644a03061-kube-api-access-wcjhr\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878479 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-csi-data-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.878519 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.378508401 +0000 UTC m=+144.489927738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878981 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-images\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.879345 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-socket-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: 
\"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.879408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-mountpoint-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.880788 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.881401 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5192d8-6708-48c6-b5e5-a081f89d3e66-serving-cert\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877632 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/360a1093-b581-4806-9f88-3d3907bd4895-tmpfs\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.881820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-plugins-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: 
\"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.883069 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-srv-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.883610 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-profile-collector-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.883753 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-node-bootstrap-token\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886189 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5192d8-6708-48c6-b5e5-a081f89d3e66-config\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886246 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-cabundle\") pod 
\"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886579 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6882836-eb39-412c-a0d6-4906c9be9b89-proxy-tls\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886968 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-registration-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.887323 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-srv-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.887701 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8e609565-a380-48f1-9b14-542a17c4ea50-cert\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.887866 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e27ee8-4574-4731-9324-031f9b3a209f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.888645 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-certs\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889063 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889128 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-apiservice-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889272 4804 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889619 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.890558 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/527ee9be-17be-4352-86fc-ef31bece3e86-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.891043 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aaa28d2-1ca6-42c3-98f7-58c644a03061-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.894994 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-webhook-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.895146 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-key\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.895750 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.903235 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cde5d02-8e0d-4b24-b7bc-b9365013d942-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.905513 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.907275 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c36c8731-9ee6-4ce6-8708-9e35e6112804-config-volume\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.908806 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c36c8731-9ee6-4ce6-8708-9e35e6112804-metrics-tls\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.910948 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" event={"ID":"88e84359-a2f8-4d55-96e4-fda2ff226372","Type":"ContainerStarted","Data":"459928bdb43f6f1a1118edd680405432a2ec9199feeecf6f85fc24cb7d9f2210"} Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.911660 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e27ee8-4574-4731-9324-031f9b3a209f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.912226 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerStarted","Data":"81d9af50a49b7a22054002410dd3f03b59f94ff18986b6c02f393dc6b67d21c6"} Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 
13:27:49.912678 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"] Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.914550 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.914600 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.918544 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n95dt\" (UniqueName: \"kubernetes.io/projected/70a41b60-6ec1-491d-9d3e-88758d91c45e-kube-api-access-n95dt\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: W0217 13:27:49.921071 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96df7f4c_b782_43e2_99b2_fa5219a59fd9.slice/crio-337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35 WatchSource:0}: Error finding container 337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35: Status 404 returned error can't find the container with id 337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35 Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.928452 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.939910 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.958620 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.977968 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400eb64-255c-46c2-b6c6-39260e013e92-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.979941 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.980294 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.48026567 +0000 UTC m=+144.591685017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.980749 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.996123 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.49610287 +0000 UTC m=+144.607522207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.002106 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.003923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.020804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.023408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.046456 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.052822 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6h2\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-kube-api-access-pp6h2\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.053702 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.059506 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/074c752f-fec1-4bd6-8773-596461ea288a-kube-api-access-rgt6h\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.066443 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.077042 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mcszv"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.085108 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.091333 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.092366 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.092970 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.592939425 +0000 UTC m=+144.704358762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.094035 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-spfls"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.100774 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.102142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnm4h\" (UniqueName: \"kubernetes.io/projected/ea50fe9b-465a-448b-97db-a91822afb720-kube-api-access-qnm4h\") pod \"migrator-59844c95c7-q46rz\" (UID: \"ea50fe9b-465a-448b-97db-a91822afb720\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"
Feb 17 13:27:50 crc kubenswrapper[4804]: W0217 13:27:50.116909 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb5c679_7c23_47fe_92b2_e035dceef1be.slice/crio-5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de WatchSource:0}: Error finding container 5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de: Status 404 returned error can't find the container with id 5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.121971 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjhr\" (UniqueName: \"kubernetes.io/projected/2aaa28d2-1ca6-42c3-98f7-58c644a03061-kube-api-access-wcjhr\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.138854 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.157189 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54d9r\" (UniqueName: \"kubernetes.io/projected/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-kube-api-access-54d9r\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.163350 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.166596 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxzwc\" (UniqueName: \"kubernetes.io/projected/4a5192d8-6708-48c6-b5e5-a081f89d3e66-kube-api-access-gxzwc\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.176938 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.186644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx9tr\" (UniqueName: \"kubernetes.io/projected/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-kube-api-access-kx9tr\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.197288 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.197715 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.697704093 +0000 UTC m=+144.809123430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.205173 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7mtv\" (UniqueName: \"kubernetes.io/projected/a6882836-eb39-412c-a0d6-4906c9be9b89-kube-api-access-l7mtv\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.244219 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dzxb\" (UniqueName: \"kubernetes.io/projected/1975682c-3445-467d-a0bd-a87b0ebf604b-kube-api-access-4dzxb\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.248308 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.251521 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.262028 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8dn\" (UniqueName: \"kubernetes.io/projected/8e609565-a380-48f1-9b14-542a17c4ea50-kube-api-access-2k8dn\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.274823 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.282892 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.298733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.298965 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.798937214 +0000 UTC m=+144.910356551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.299149 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5dmh\" (UniqueName: \"kubernetes.io/projected/81a4453c-e1e8-4624-a19b-f08ec4df93d7-kube-api-access-j5dmh\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.299224 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.299744 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.799735751 +0000 UTC m=+144.911155168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.309262 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.312255 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbwg\" (UniqueName: \"kubernetes.io/projected/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-kube-api-access-khbwg\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.314572 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.327353 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4s62\" (UniqueName: \"kubernetes.io/projected/fd233b99-2205-4e95-ba04-232015517afb-kube-api-access-g4s62\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.360289 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.364637 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spwnl\" (UniqueName: \"kubernetes.io/projected/360a1093-b581-4806-9f88-3d3907bd4895-kube-api-access-spwnl\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.371660 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.371901 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.384117 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h988v\" (UniqueName: \"kubernetes.io/projected/c36c8731-9ee6-4ce6-8708-9e35e6112804-kube-api-access-h988v\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.385977 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-td8n5"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.401832 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.402226 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.902191323 +0000 UTC m=+145.013610660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.407699 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.407744 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-228st\" (UniqueName: \"kubernetes.io/projected/527ee9be-17be-4352-86fc-ef31bece3e86-kube-api-access-228st\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.412059 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6mxf"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.419449 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxd8k\" (UniqueName: \"kubernetes.io/projected/28e27ee8-4574-4731-9324-031f9b3a209f-kube-api-access-kxd8k\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.424955 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ssf69"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.426663 4804 csr.go:261] certificate signing request csr-pwcqm is approved, waiting to be issued
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.430253 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.434643 4804 csr.go:257] certificate signing request csr-pwcqm is issued
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.452494 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.502887 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.505012 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.505476 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.005460343 +0000 UTC m=+145.116879680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.528693 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.532617 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.546478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.568523 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.599560 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.600494 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.605961 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.606102 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.106079763 +0000 UTC m=+145.217499100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.606287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.606638 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.10662228 +0000 UTC m=+145.218041617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.623164 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.623457 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:50 crc kubenswrapper[4804]: W0217 13:27:50.626535 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d929eaa_807c_4809_8b8a_78c186418e71.slice/crio-c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45 WatchSource:0}: Error finding container c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45: Status 404 returned error can't find the container with id c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.654666 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.707419 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.707663 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.207638204 +0000 UTC m=+145.319057541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.715791 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.716667 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.216633585 +0000 UTC m=+145.328052922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.731727 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.770527 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"]
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.816999 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.817216 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.317169903 +0000 UTC m=+145.428589240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.817340 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.817695 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.31768154 +0000 UTC m=+145.429100877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.836392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"]
Feb 17 13:27:50 crc kubenswrapper[4804]: W0217 13:27:50.837933 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea50fe9b_465a_448b_97db_a91822afb720.slice/crio-72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9 WatchSource:0}: Error finding container 72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9: Status 404 returned error can't find the container with id 72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.840090 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-tz5vz" podStartSLOduration=123.84007423 podStartE2EDuration="2m3.84007423s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.83946289 +0000 UTC m=+144.950882227" watchObservedRunningTime="2026-02-17 13:27:50.84007423 +0000 UTC m=+144.951493567"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.877002 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" podStartSLOduration=123.876984647 podStartE2EDuration="2m3.876984647s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.874701341 +0000 UTC m=+144.986120678" watchObservedRunningTime="2026-02-17 13:27:50.876984647 +0000 UTC m=+144.988403984"
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.918926 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.919086 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.419054316 +0000 UTC m=+145.530473663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.919231 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.919543 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.419531522 +0000 UTC m=+145.530950859 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.919975 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" podStartSLOduration=123.919958166 podStartE2EDuration="2m3.919958166s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.918313801 +0000 UTC m=+145.029733138" watchObservedRunningTime="2026-02-17 13:27:50.919958166 +0000 UTC m=+145.031377503" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.932321 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.954056 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" podStartSLOduration=123.954040797 podStartE2EDuration="2m3.954040797s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.953222181 +0000 UTC m=+145.064641518" watchObservedRunningTime="2026-02-17 13:27:50.954040797 +0000 UTC m=+145.065460134" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.965649 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="3ea797e4-54e0-4063-8d2b-647f6686e2a8" containerID="69ade887fb4561f7461039cedf4c40001910b0d18b0de5daf1a6aeffb6f8d6d9" exitCode=0 Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.965837 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerDied","Data":"69ade887fb4561f7461039cedf4c40001910b0d18b0de5daf1a6aeffb6f8d6d9"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.966613 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" event={"ID":"78dad77c-6d3f-43bc-93a3-ecd7dce378f3","Type":"ContainerStarted","Data":"0a96e1ef2bfcf8764be3660e10a30ae67e6eb64a806638e281a0fd209ce60dfc"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.967598 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kbpk6" event={"ID":"074c752f-fec1-4bd6-8773-596461ea288a","Type":"ContainerStarted","Data":"f04934fadfb13f4a2b94d23f826ccbf2c11587f3079cfc04ee775c0340ba1584"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.968720 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerStarted","Data":"3d97fb8448b10f48b071b6a70d4f4f2987b70d4bf2286e1821fcf2cadb229b90"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.981547 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.014979 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" podStartSLOduration=124.014965439 podStartE2EDuration="2m4.014965439s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.012558627 +0000 UTC m=+145.123977964" watchObservedRunningTime="2026-02-17 13:27:51.014965439 +0000 UTC m=+145.126384776" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.020674 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.021140 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.521122394 +0000 UTC m=+145.632541731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.029654 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" event={"ID":"17c8a131-fc0e-44b5-b374-846e6b2aeb1c","Type":"ContainerStarted","Data":"45a96212ff94af6d68214bf3f1edff552b90d12c42957570b833f1872469a96a"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.029696 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" event={"ID":"17c8a131-fc0e-44b5-b374-846e6b2aeb1c","Type":"ContainerStarted","Data":"2cb41ce0e1d66729234c40bc09930a98f8f2dbab8039e3fe7eb214142ec4274f"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.067531 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" event={"ID":"96df7f4c-b782-43e2-99b2-fa5219a59fd9","Type":"ContainerStarted","Data":"fd45a3b87ee7ec050a1e2226df399e2ca244c1384212608f1376c50fc62ba63e"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.067572 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" event={"ID":"96df7f4c-b782-43e2-99b2-fa5219a59fd9","Type":"ContainerStarted","Data":"337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.069571 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" event={"ID":"4c3cd53a-4a82-449d-a270-b41853fa2c8a","Type":"ContainerStarted","Data":"dcb906021bb914b0d35e536db8e77dce9a79a9b9c7a4d14ebe8fb578f4372c29"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.072487 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mcszv" event={"ID":"bfb5c679-7c23-47fe-92b2-e035dceef1be","Type":"ContainerStarted","Data":"5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.078289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" event={"ID":"6f8789cf-f788-4c81-9624-532aa823de1c","Type":"ContainerStarted","Data":"a209eb83190d90d2fb6a3d22177034b2d0090e6a251f0ff17bf2d6cb44e252d6"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.083783 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerStarted","Data":"a4b6cbfefaf077ffe0f3e71671fde2907fe889b88fd4a0d27ee5e5b910c2832f"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.112144 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" event={"ID":"9400eb64-255c-46c2-b6c6-39260e013e92","Type":"ContainerStarted","Data":"1e0db7b9855a1df421980bae6f948a4fd3ebe623b23861e2e967766f9a6951c3"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.126058 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" 
(UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.127134 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.627117025 +0000 UTC m=+145.738536362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.136917 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" event={"ID":"7cde5d02-8e0d-4b24-b7bc-b9365013d942","Type":"ContainerStarted","Data":"4622fca4f4493e3824ba6757145476211ce7042560569e7102a91e59e70f017e"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.139797 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ssf69" event={"ID":"1975682c-3445-467d-a0bd-a87b0ebf604b","Type":"ContainerStarted","Data":"fad0a6ed87fc56a684f2075982688a0f7f794c18f1763206315edfb485e10c3f"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.156123 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerStarted","Data":"c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45"} Feb 17 13:27:51 crc 
kubenswrapper[4804]: I0217 13:27:51.161036 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" event={"ID":"ea50fe9b-465a-448b-97db-a91822afb720","Type":"ContainerStarted","Data":"72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.163429 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerStarted","Data":"71eeeb2236ea109e4995422167d6b6185d64b78a4f394944d8af1d30f1eaa147"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.164408 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.164454 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.227299 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.227663 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.727636372 +0000 UTC m=+145.839055709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.320543 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a5192d8_6708_48c6_b5e5_a081f89d3e66.slice/crio-157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b WatchSource:0}: Error finding container 157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b: Status 404 returned error can't find the container with id 157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.329802 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.330427 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.830409845 +0000 UTC m=+145.941829192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.431712 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.431881 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.931853903 +0000 UTC m=+146.043273250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.434910 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.435416 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 13:22:50 +0000 UTC, rotation deadline is 2027-01-10 04:50:41.090416388 +0000 UTC Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.435457 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7839h22m49.654962545s for next certificate rotation Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.436782 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.936739176 +0000 UTC m=+146.048158513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.447927 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5sp6x"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.554950 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.555665 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.556092 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.056072323 +0000 UTC m=+146.167491660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.556800 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwjsv"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.557016 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" podStartSLOduration=124.556996864 podStartE2EDuration="2m4.556996864s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.544660101 +0000 UTC m=+145.656079448" watchObservedRunningTime="2026-02-17 13:27:51.556996864 +0000 UTC m=+145.668416201" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.566542 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6mxf"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.605161 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-w4nl5" podStartSLOduration=124.605143567 podStartE2EDuration="2m4.605143567s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.60194123 +0000 UTC m=+145.713360597" watchObservedRunningTime="2026-02-17 13:27:51.605143567 +0000 UTC m=+145.716562904" Feb 17 13:27:51 crc 
kubenswrapper[4804]: I0217 13:27:51.658022 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.658661 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.158649499 +0000 UTC m=+146.270068836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.680631 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-td8n5"] Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.690671 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4df830_6ec9_4f4d_860e_f97af3088371.slice/crio-36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226 WatchSource:0}: Error finding container 36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226: Status 404 returned error can't find the container with id 36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226 Feb 17 13:27:51 crc kubenswrapper[4804]: 
I0217 13:27:51.699687 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" podStartSLOduration=124.699668213 podStartE2EDuration="2m4.699668213s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.695131481 +0000 UTC m=+145.806550828" watchObservedRunningTime="2026-02-17 13:27:51.699668213 +0000 UTC m=+145.811087550" Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.762110 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e609565_a380_48f1_9b14_542a17c4ea50.slice/crio-a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8 WatchSource:0}: Error finding container a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8: Status 404 returned error can't find the container with id a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8 Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.762229 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.262181077 +0000 UTC m=+146.373600414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.762134 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.762820 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.764649 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.264630399 +0000 UTC m=+146.376049736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.805358    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"]
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.811122    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"]
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.813947    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"]
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.828675    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"]
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.843958    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"]
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.866551    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.866974    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.366955856 +0000 UTC m=+146.478375193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.870897    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"]
Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.889676    4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce6eded_da13_4bb7_a87d_71b87d0e7f06.slice/crio-8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4 WatchSource:0}: Error finding container 8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4: Status 404 returned error can't find the container with id 8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4
Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.890396    4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6882836_eb39_412c_a0d6_4906c9be9b89.slice/crio-f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879 WatchSource:0}: Error finding container f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879: Status 404 returned error can't find the container with id f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.906191    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"]
Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.931142    4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd233b99_2205_4e95_ba04_232015517afb.slice/crio-c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd WatchSource:0}: Error finding container c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd: Status 404 returned error can't find the container with id c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.964399    4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"]
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.968022    4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.968362    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.468350743 +0000 UTC m=+146.579770080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.996657    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" podStartSLOduration=123.99664103 podStartE2EDuration="2m3.99664103s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.995626986 +0000 UTC m=+146.107046323" watchObservedRunningTime="2026-02-17 13:27:51.99664103 +0000 UTC m=+146.108060367"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.055024    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" podStartSLOduration=125.054965234 podStartE2EDuration="2m5.054965234s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.037708766 +0000 UTC m=+146.149128103" watchObservedRunningTime="2026-02-17 13:27:52.054965234 +0000 UTC m=+146.166384571"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.069149    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.069319    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.569292974 +0000 UTC m=+146.680712311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.071904    4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.072410    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.572398968 +0000 UTC m=+146.683818305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.172971    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.173252    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.673179104 +0000 UTC m=+146.784598441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.173658    4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.174091    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.674078794 +0000 UTC m=+146.785498131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.174769    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" event={"ID":"6f8789cf-f788-4c81-9624-532aa823de1c","Type":"ContainerStarted","Data":"35c48e8fb2fa81f91413e79f65510c475665d951d8e3729fdc7ec2652d35c229"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.182585    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" event={"ID":"ea50fe9b-465a-448b-97db-a91822afb720","Type":"ContainerStarted","Data":"f43a05989a0cb018994c7ed93f6fcbb3287c333caa1e224a9b8ad854ef507a79"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.186056    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerStarted","Data":"d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.186448    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.193666    4804 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mqkcq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.193716    4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.204368    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podStartSLOduration=124.204355288 podStartE2EDuration="2m4.204355288s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.200701165 +0000 UTC m=+146.312120502" watchObservedRunningTime="2026-02-17 13:27:52.204355288 +0000 UTC m=+146.315774625"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.235676    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerStarted","Data":"8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.272560    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" event={"ID":"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6","Type":"ContainerStarted","Data":"d09f27464dd0852c3eb4f37afc58c154c9fbb7700f52b9305aaf919abcafbf4a"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.274318    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.274379    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ssf69" event={"ID":"1975682c-3445-467d-a0bd-a87b0ebf604b","Type":"ContainerStarted","Data":"880ed9d5f01451b75d2dc6ed95deb43e16cb8367642cbf6aeea953cb0fd0e13c"}
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.274528    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.774500308 +0000 UTC m=+146.885919695 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.274858    4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.275486    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.77546926 +0000 UTC m=+146.886888597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.288889    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerStarted","Data":"691b4b9bb1ec8153713740adfbef24a2316c8f6246fc43ff1dca064d36af3efd"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.295031    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.296027    4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.297302    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ssf69" podStartSLOduration=5.297291411 podStartE2EDuration="5.297291411s" podCreationTimestamp="2026-02-17 13:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.295463039 +0000 UTC m=+146.406882376" watchObservedRunningTime="2026-02-17 13:27:52.297291411 +0000 UTC m=+146.408710748"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.307609    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" event={"ID":"4c3cd53a-4a82-449d-a270-b41853fa2c8a","Type":"ContainerStarted","Data":"277f5e06890a3b0a429ab21613a9d4cdb62546cc0ce8158d53f54e0de8d34994"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.317360    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" event={"ID":"a6882836-eb39-412c-a0d6-4906c9be9b89","Type":"ContainerStarted","Data":"f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.320010    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerStarted","Data":"76ca0c3a1f23c1bfd5829400e9cd39546fdd21f9b110e5b71897bf1278603129"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.320554    4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.325046    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" event={"ID":"78dad77c-6d3f-43bc-93a3-ecd7dce378f3","Type":"ContainerStarted","Data":"ab20668d8bb8760f6b156b43021f78b8c090ccfac11a6030ff207b369f1b77ce"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.326504    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" event={"ID":"bd4df830-6ec9-4f4d-860e-f97af3088371","Type":"ContainerStarted","Data":"36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.328602    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" event={"ID":"2aaa28d2-1ca6-42c3-98f7-58c644a03061","Type":"ContainerStarted","Data":"98fc7dcdd6e644150fd44190a09bb7717d7abbaa177adba025a2f687c8e15714"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.328630    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" event={"ID":"2aaa28d2-1ca6-42c3-98f7-58c644a03061","Type":"ContainerStarted","Data":"347362c67773c71448c8813eb3a8b6fe9bbfca98f9e69e54c49cdc2ff253fd7e"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.330452    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" event={"ID":"96df7f4c-b782-43e2-99b2-fa5219a59fd9","Type":"ContainerStarted","Data":"371959c237c07729c0ef1bbe1f63b6e5bc67fdacdfa6b555347a4ef23550900d"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.332079    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kbpk6" event={"ID":"074c752f-fec1-4bd6-8773-596461ea288a","Type":"ContainerStarted","Data":"8ef76cd8399e73581929a3003d22c9543d350149ae189e6a2a726e30aa4305b4"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.361620    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" event={"ID":"17c8a131-fc0e-44b5-b374-846e6b2aeb1c","Type":"ContainerStarted","Data":"80d210a2a3c63131fb8282f24532253bbe1e87464049f2ccc07402908932ed1e"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.364730    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" event={"ID":"28e27ee8-4574-4731-9324-031f9b3a209f","Type":"ContainerStarted","Data":"8a4c062f9db2dace0be040b4679b42ec47596225d8d9ba0a189393b5a3eab071"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.367145    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" event={"ID":"527ee9be-17be-4352-86fc-ef31bece3e86","Type":"ContainerStarted","Data":"752275f6e311d6fa52f9ec458f3dd3d978e12c3b722feb4df6e38efa3fb4bed2"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.368669    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"b981933933cb97d97cf932eb8b0d74b01f46746f21c79b9a7996eaaaeb1edc53"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.370971    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" podStartSLOduration=125.370953368 podStartE2EDuration="2m5.370953368s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.341965968 +0000 UTC m=+146.453385305" watchObservedRunningTime="2026-02-17 13:27:52.370953368 +0000 UTC m=+146.482372705"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.375120    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" podStartSLOduration=125.375101737 podStartE2EDuration="2m5.375101737s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.370066879 +0000 UTC m=+146.481486216" watchObservedRunningTime="2026-02-17 13:27:52.375101737 +0000 UTC m=+146.486521064"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.375812    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.377449    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.877420185 +0000 UTC m=+146.988839522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.393059    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-td8n5" event={"ID":"8e609565-a380-48f1-9b14-542a17c4ea50","Type":"ContainerStarted","Data":"a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.436470    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" podStartSLOduration=125.436452602 podStartE2EDuration="2m5.436452602s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.434183656 +0000 UTC m=+146.545602993" watchObservedRunningTime="2026-02-17 13:27:52.436452602 +0000 UTC m=+146.547871939"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.447004    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" event={"ID":"81a4453c-e1e8-4624-a19b-f08ec4df93d7","Type":"ContainerStarted","Data":"dddf129af9a00ddc4f0969d0e5a291dc33022c3ebe91aea664d9b86e31058b0d"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.467974    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kbpk6" podStartSLOduration=125.467951258 podStartE2EDuration="2m5.467951258s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.459779063 +0000 UTC m=+146.571198400" watchObservedRunningTime="2026-02-17 13:27:52.467951258 +0000 UTC m=+146.579370595"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.477981    4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.479628    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.979613777 +0000 UTC m=+147.091033114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.483880    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" event={"ID":"4a5192d8-6708-48c6-b5e5-a081f89d3e66","Type":"ContainerStarted","Data":"61a388e9adc9a943377d688d27c2dc81dfad669670c5a3bf1b1a23df23c9b059"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.484016    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" event={"ID":"4a5192d8-6708-48c6-b5e5-a081f89d3e66","Type":"ContainerStarted","Data":"157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.486491    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6mxf" event={"ID":"c36c8731-9ee6-4ce6-8708-9e35e6112804","Type":"ContainerStarted","Data":"b2a2a565951f1940fab3a2e856fad4543924e55909a40759f9b387aadb12721f"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.490133    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" event={"ID":"fd233b99-2205-4e95-ba04-232015517afb","Type":"ContainerStarted","Data":"c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.493289    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" event={"ID":"7cde5d02-8e0d-4b24-b7bc-b9365013d942","Type":"ContainerStarted","Data":"c924f7a2b708775eb6e228afa4b84edbaca3d7ee7f383bfe914fd277a0572a48"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.495581    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerStarted","Data":"e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.496540    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.501553    4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.501611    4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.516657    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" podStartSLOduration=124.516630698 podStartE2EDuration="2m4.516630698s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.50654076 +0000 UTC m=+146.617960097" watchObservedRunningTime="2026-02-17 13:27:52.516630698 +0000 UTC m=+146.628050035"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.521451    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" event={"ID":"6c98dfab-f166-4eb4-b385-724d6b9b9d7a","Type":"ContainerStarted","Data":"3763f04c0b5a11e4d2f859d573b3f3722479804f6fd8c70f1af703155e237371"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.521496    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" event={"ID":"6c98dfab-f166-4eb4-b385-724d6b9b9d7a","Type":"ContainerStarted","Data":"3d0d40e975c4ac627b43e84809a03150f409bc6614f8fb7f3a983ec339ab9823"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.532184    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerStarted","Data":"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.532277    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.532951    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podStartSLOduration=125.532932743 podStartE2EDuration="2m5.532932743s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.530788832 +0000 UTC m=+146.642208169" watchObservedRunningTime="2026-02-17 13:27:52.532932743 +0000 UTC m=+146.644352070"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.536657    4804 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bstw9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body=
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.536707    4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.540768    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mcszv" event={"ID":"bfb5c679-7c23-47fe-92b2-e035dceef1be","Type":"ContainerStarted","Data":"ddf59f8040b668f0f4c58d8bef4c204dcd3e4b466336f43e6ee9b0870efcec50"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.541532    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.542567    4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" event={"ID":"360a1093-b581-4806-9f88-3d3907bd4895","Type":"ContainerStarted","Data":"30624883332b7603a32f0b5c7350e0ff499a85473ddd4fbdc16ed765cd6f36f8"}
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.543294    4804 patch_prober.go:28] interesting pod/console-operator-58897d9998-mcszv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.543334    4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mcszv" podUID="bfb5c679-7c23-47fe-92b2-e035dceef1be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.549863    4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.584576    4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.585555    4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.085540856 +0000 UTC m=+147.196960193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.585720    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" podStartSLOduration=125.585702651 podStartE2EDuration="2m5.585702651s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.557533628 +0000 UTC m=+146.668952965" watchObservedRunningTime="2026-02-17 13:27:52.585702651 +0000 UTC m=+146.697121988"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.585953    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" podStartSLOduration=124.585947559 podStartE2EDuration="2m4.585947559s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.584064066 +0000 UTC m=+146.695483403" watchObservedRunningTime="2026-02-17 13:27:52.585947559 +0000 UTC m=+146.697366896"
Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.685954    4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" podStartSLOduration=124.685930909 podStartE2EDuration="2m4.685930909s" podCreationTimestamp="2026-02-17 13:25:48
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.667493271 +0000 UTC m=+146.778912608" watchObservedRunningTime="2026-02-17 13:27:52.685930909 +0000 UTC m=+146.797350246" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.686685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.689643 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.189605181 +0000 UTC m=+147.301024518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.715639 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podStartSLOduration=125.715620883 podStartE2EDuration="2m5.715620883s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.715261911 +0000 UTC m=+146.826681258" watchObservedRunningTime="2026-02-17 13:27:52.715620883 +0000 UTC m=+146.827040230" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.747916 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mcszv" podStartSLOduration=125.747896805 podStartE2EDuration="2m5.747896805s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.741181919 +0000 UTC m=+146.852601256" watchObservedRunningTime="2026-02-17 13:27:52.747896805 +0000 UTC m=+146.859316142" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.788340 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.788783 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.288765943 +0000 UTC m=+147.400185280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.889514 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.889832 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.389819278 +0000 UTC m=+147.501238615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.990285 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.990457 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.490422238 +0000 UTC m=+147.601841575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.990569 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.990979 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.490968116 +0000 UTC m=+147.602387533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.068178 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.071277 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:53 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:53 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:53 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.071332 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.091679 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.092259 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.592236868 +0000 UTC m=+147.703656235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.193638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.194092 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.694069989 +0000 UTC m=+147.805489396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.294725 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.295180 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.795161475 +0000 UTC m=+147.906580812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.400497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.400889 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.900868436 +0000 UTC m=+148.012287773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.505775 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.506259 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.006217595 +0000 UTC m=+148.117636932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.506433 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.506900 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.006890637 +0000 UTC m=+148.118309974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.549614 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" event={"ID":"bd4df830-6ec9-4f4d-860e-f97af3088371","Type":"ContainerStarted","Data":"2a9ec8b1b537fc993084160d692e82780a76bdb11a34d891cccbf3f8dac45031"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.552017 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" event={"ID":"81a4453c-e1e8-4624-a19b-f08ec4df93d7","Type":"ContainerStarted","Data":"8dfcf289f6b49c93fb0be9c6c8194c60cc7412daf6fe6ff845b21d0f07db7852"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.552213 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.553864 4804 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vmmzq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.553903 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" podUID="81a4453c-e1e8-4624-a19b-f08ec4df93d7" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.555141 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerStarted","Data":"79a0541efdfdf50b3866ac8b0b6206b325d36bec38c322976e1c15ffbaf6838f"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.555168 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerStarted","Data":"a23b15fab9f56fb7e41335504d13cb65034f5191177ba08a44ab7be11ffffa97"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.560022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-td8n5" event={"ID":"8e609565-a380-48f1-9b14-542a17c4ea50","Type":"ContainerStarted","Data":"ca7c5a82ceeba6e17d07d04b4b8b17de5363d890a0ac2ed1cd83b59d39695391"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.563937 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" event={"ID":"9400eb64-255c-46c2-b6c6-39260e013e92","Type":"ContainerStarted","Data":"acb53343e24b64214eb8d63b479506b99fb10c57ac8fe080ef42bcca2d89b04d"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.565654 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" event={"ID":"360a1093-b581-4806-9f88-3d3907bd4895","Type":"ContainerStarted","Data":"2782ad929923ff294638d4cfc8dab0936f539b2948e17124ffd7769c4d9020c9"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.566013 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:53 crc 
kubenswrapper[4804]: I0217 13:27:53.567048 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" event={"ID":"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6","Type":"ContainerStarted","Data":"56c53ed4105db510e93c67b057053181df6ebbb6d8541ab870a20fa4dc8300ce"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.567286 4804 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9d9jq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.567324 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podUID="360a1093-b581-4806-9f88-3d3907bd4895" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569316 4804 generic.go:334] "Generic (PLEG): container finished" podID="70a41b60-6ec1-491d-9d3e-88758d91c45e" containerID="691b4b9bb1ec8153713740adfbef24a2316c8f6246fc43ff1dca064d36af3efd" exitCode=0 Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569362 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerDied","Data":"691b4b9bb1ec8153713740adfbef24a2316c8f6246fc43ff1dca064d36af3efd"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569399 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" 
event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerStarted","Data":"7355a035aa7a473d1e3a624201cd7f56eee3e639ccdb8eaad53592345713c9d4"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569581 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.570408 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6mxf" event={"ID":"c36c8731-9ee6-4ce6-8708-9e35e6112804","Type":"ContainerStarted","Data":"9daa60af0a56693576a356277be4e8c6d9f194e128d21b17f28caa1465248b95"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.574059 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerStarted","Data":"4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.576020 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" event={"ID":"2aaa28d2-1ca6-42c3-98f7-58c644a03061","Type":"ContainerStarted","Data":"b9519c5563ed579e5f6bcfbd075913b2603d9e5eee582aaa188ea4e5e16e7df8"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.576646 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.585027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerStarted","Data":"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.585927 4804 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.587142 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" event={"ID":"ea50fe9b-465a-448b-97db-a91822afb720","Type":"ContainerStarted","Data":"8bd913f3da6cc61714dd2cb60137b6fcbbd0af3fcff7addf3f4ccc60444dc47f"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.587814 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6k2g8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.587844 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.588825 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" event={"ID":"28e27ee8-4574-4731-9324-031f9b3a209f","Type":"ContainerStarted","Data":"46d82360d36ed3e50bb06f3f1654ff43325603c6cf8e180cc5b5ae8beaa2dcf0"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.590122 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" event={"ID":"527ee9be-17be-4352-86fc-ef31bece3e86","Type":"ContainerStarted","Data":"6ae1bb948ecea1ad7a23965654328918240b275b6050fcce0a3757ab066fb634"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.591109 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" event={"ID":"a6882836-eb39-412c-a0d6-4906c9be9b89","Type":"ContainerStarted","Data":"c4015993bceb7bee92d209c40fe142b66dc9cece4f5b4d1e69297b741f615cb7"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.593481 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" event={"ID":"fd233b99-2205-4e95-ba04-232015517afb","Type":"ContainerStarted","Data":"206ebf3fd6fd37435daf25f2fe623fe2cd8a8e6c9a6b6697d1335963dc1111f1"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.593512 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594140 4804 patch_prober.go:28] interesting pod/console-operator-58897d9998-mcszv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594185 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mcszv" podUID="bfb5c679-7c23-47fe-92b2-e035dceef1be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594279 4804 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bstw9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594340 
4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595030 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595049 4804 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-645bx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595071 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595087 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" podUID="fd233b99-2205-4e95-ba04-232015517afb" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595679 4804 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mqkcq 
container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595712 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.605061 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" podStartSLOduration=125.605047885 podStartE2EDuration="2m5.605047885s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.583531604 +0000 UTC m=+147.694950941" watchObservedRunningTime="2026-02-17 13:27:53.605047885 +0000 UTC m=+147.716467222" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.607297 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" podStartSLOduration=125.60728992 podStartE2EDuration="2m5.60728992s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.604571609 +0000 UTC m=+147.715990946" watchObservedRunningTime="2026-02-17 13:27:53.60728992 +0000 UTC m=+147.718709257" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.607732 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.607898 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.107880849 +0000 UTC m=+148.219300186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.607946 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.609399 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.109391211 +0000 UTC m=+148.220810548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.629404 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" podStartSLOduration=125.629386301 podStartE2EDuration="2m5.629386301s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.628683467 +0000 UTC m=+147.740102804" watchObservedRunningTime="2026-02-17 13:27:53.629386301 +0000 UTC m=+147.740805638" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.649494 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" podStartSLOduration=126.649478684 podStartE2EDuration="2m6.649478684s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.647521337 +0000 UTC m=+147.758940674" watchObservedRunningTime="2026-02-17 13:27:53.649478684 +0000 UTC m=+147.760898021" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.671588 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" podStartSLOduration=126.671569644 podStartE2EDuration="2m6.671569644s" podCreationTimestamp="2026-02-17 13:25:47 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.668880293 +0000 UTC m=+147.780299630" watchObservedRunningTime="2026-02-17 13:27:53.671569644 +0000 UTC m=+147.782988981" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.686050 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podStartSLOduration=125.686031787 podStartE2EDuration="2m5.686031787s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.685103387 +0000 UTC m=+147.796522724" watchObservedRunningTime="2026-02-17 13:27:53.686031787 +0000 UTC m=+147.797451124" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.705935 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-46w22" podStartSLOduration=126.705921783 podStartE2EDuration="2m6.705921783s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.70402247 +0000 UTC m=+147.815441807" watchObservedRunningTime="2026-02-17 13:27:53.705921783 +0000 UTC m=+147.817341120" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.709237 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.709625 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.209583516 +0000 UTC m=+148.321002913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.712815 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.716108 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.216099795 +0000 UTC m=+148.327519132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.725994 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-td8n5" podStartSLOduration=7.725977625 podStartE2EDuration="7.725977625s" podCreationTimestamp="2026-02-17 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.72523287 +0000 UTC m=+147.836652237" watchObservedRunningTime="2026-02-17 13:27:53.725977625 +0000 UTC m=+147.837396962" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.777165 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" podStartSLOduration=125.777142709 podStartE2EDuration="2m5.777142709s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.749240265 +0000 UTC m=+147.860659612" watchObservedRunningTime="2026-02-17 13:27:53.777142709 +0000 UTC m=+147.888562046" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.779742 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podStartSLOduration=125.779732716 podStartE2EDuration="2m5.779732716s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.776449446 +0000 UTC m=+147.887868783" watchObservedRunningTime="2026-02-17 13:27:53.779732716 +0000 UTC m=+147.891152053" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.795477 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" podStartSLOduration=125.795460923 podStartE2EDuration="2m5.795460923s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.793722825 +0000 UTC m=+147.905142162" watchObservedRunningTime="2026-02-17 13:27:53.795460923 +0000 UTC m=+147.906880260" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.810533 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" podStartSLOduration=125.810495857 podStartE2EDuration="2m5.810495857s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.808361795 +0000 UTC m=+147.919781132" watchObservedRunningTime="2026-02-17 13:27:53.810495857 +0000 UTC m=+147.921915194" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.813933 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.814316 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.314301394 +0000 UTC m=+148.425720731 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.827010 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" podStartSLOduration=125.826992449 podStartE2EDuration="2m5.826992449s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.824391212 +0000 UTC m=+147.935810549" watchObservedRunningTime="2026-02-17 13:27:53.826992449 +0000 UTC m=+147.938411786" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.915287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.915751 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.415732251 +0000 UTC m=+148.527151648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.016739 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.016959 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.516925121 +0000 UTC m=+148.628344458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.018105 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.018440 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.518429472 +0000 UTC m=+148.629848909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.070175 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:54 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:54 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:54 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.070256 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.118962 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.119186 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:54.619147525 +0000 UTC m=+148.730566872 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.119328 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.119769 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.619757536 +0000 UTC m=+148.731176893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.205557 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.206178 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.206419 4804 patch_prober.go:28] interesting pod/apiserver-76f77b778f-46w22 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.206461 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-46w22" podUID="3ea797e4-54e0-4063-8d2b-647f6686e2a8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.220683 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 
13:27:54.220895 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.720878942 +0000 UTC m=+148.832298279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.221058 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.221433 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.721423751 +0000 UTC m=+148.832843088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.323005 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.323069 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.823050945 +0000 UTC m=+148.934470282 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.323382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.323650 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.823642905 +0000 UTC m=+148.935062242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.424777 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.424885 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.924863886 +0000 UTC m=+149.036283223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.425003 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.425361 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.925353331 +0000 UTC m=+149.036772668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.526123 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.526512 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.026497799 +0000 UTC m=+149.137917136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.597826 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" event={"ID":"527ee9be-17be-4352-86fc-ef31bece3e86","Type":"ContainerStarted","Data":"0528d846578dc0dbefe2cbed846ca3fcbf47ebf9286a31e312c4ccdb2764afc8"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.600015 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6mxf" event={"ID":"c36c8731-9ee6-4ce6-8708-9e35e6112804","Type":"ContainerStarted","Data":"3edbca92b41e3bc0a5efddec609eaa025b38796ec47c30a71c9349c4082125a6"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.600161 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.601864 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" event={"ID":"a6882836-eb39-412c-a0d6-4906c9be9b89","Type":"ContainerStarted","Data":"0a375fd15e41373a403c853831490ec34e3e6b25fca0f36c84d69bc96f1e8ceb"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.604449 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" event={"ID":"bd4df830-6ec9-4f4d-860e-f97af3088371","Type":"ContainerStarted","Data":"4a7592febf5222b4c6dcee1635095b02462340091529fb375dca19c64fff324e"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 
13:27:54.605701 4804 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-645bx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.605735 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" podUID="fd233b99-2205-4e95-ba04-232015517afb" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.606971 4804 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vmmzq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607002 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607033 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607047 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6k2g8 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607092 4804 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9d9jq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607033 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" podUID="81a4453c-e1e8-4624-a19b-f08ec4df93d7" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607114 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podUID="360a1093-b581-4806-9f88-3d3907bd4895" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607095 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.627948 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.628321 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.12830468 +0000 UTC m=+149.239724017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.646249 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" podStartSLOduration=126.64622862 podStartE2EDuration="2m6.64622862s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.619601968 +0000 UTC m=+148.731021305" watchObservedRunningTime="2026-02-17 13:27:54.64622862 +0000 UTC m=+148.757647947" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.646552 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" podStartSLOduration=127.64654649 podStartE2EDuration="2m7.64654649s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.643274012 +0000 UTC m=+148.754693349" watchObservedRunningTime="2026-02-17 13:27:54.64654649 +0000 UTC m=+148.757965837" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.718660 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" podStartSLOduration=126.718641596 podStartE2EDuration="2m6.718641596s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.676550996 +0000 UTC m=+148.787970333" watchObservedRunningTime="2026-02-17 13:27:54.718641596 +0000 UTC m=+148.830060933" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.720945 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d6mxf" podStartSLOduration=7.720933913 podStartE2EDuration="7.720933913s" podCreationTimestamp="2026-02-17 13:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.7167044 +0000 UTC m=+148.828123737" watchObservedRunningTime="2026-02-17 13:27:54.720933913 +0000 UTC m=+148.832353250" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.728809 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.728994 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.228950241 +0000 UTC m=+149.340369578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.731380 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.735182 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.235166439 +0000 UTC m=+149.346585776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.833069 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.833321 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.333290816 +0000 UTC m=+149.444710153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.833517 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.833897 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.333882965 +0000 UTC m=+149.445302302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.934188 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.934338 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.43431301 +0000 UTC m=+149.545732347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.934442 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.934727 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.434713633 +0000 UTC m=+149.546132960 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.034956 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.035116 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.535085535 +0000 UTC m=+149.646504872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.036522 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.036821 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.536808883 +0000 UTC m=+149.648228220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.071795 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:55 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:55 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:55 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.071868 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.138115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.138339 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:55.638302243 +0000 UTC m=+149.749721920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.138401 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.138729 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.638713347 +0000 UTC m=+149.750132744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.239827 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.240258 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.740227456 +0000 UTC m=+149.851646793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.342086 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.342454 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.84243926 +0000 UTC m=+149.953858597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443695 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.443874 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.943839607 +0000 UTC m=+150.055258944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443926 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443968 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443997 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.444016 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.444048 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.444552 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.9445449 +0000 UTC m=+150.055964237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.445044 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.454947 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.462812 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.478572 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.544758 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.544961 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.044931622 +0000 UTC m=+150.156350949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.545046 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.545411 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.045394688 +0000 UTC m=+150.156814025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.606270 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.620028 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.621971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"99e6fa30f665a16ca4fdadb770e80b0a0dce06c141f18d2af61b8e64abe50477"} Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.622988 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6k2g8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.623036 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.640143 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.640570 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.648126 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.648537 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.148520942 +0000 UTC m=+150.259940279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.750642 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.752264 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.252243906 +0000 UTC m=+150.363663273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.836738 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.836795 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.852241 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.852614 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.352580638 +0000 UTC m=+150.463999985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.852697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.853088 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.353076724 +0000 UTC m=+150.464496111 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.953446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.953653 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.453623012 +0000 UTC m=+150.565042349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.953780 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.954165 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.454149839 +0000 UTC m=+150.565569176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.057931 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.058258 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.558239546 +0000 UTC m=+150.669658883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.062934 4804 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-b8qc5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.063301 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" podUID="70a41b60-6ec1-491d-9d3e-88758d91c45e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.063660 4804 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-b8qc5 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.063686 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" podUID="70a41b60-6ec1-491d-9d3e-88758d91c45e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: 
connection refused" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.073578 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:56 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:56 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:56 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.073643 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.160155 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.160538 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.660520533 +0000 UTC m=+150.771939860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.260635 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.260905 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.760891464 +0000 UTC m=+150.872310801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.361873 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.362217 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.862190468 +0000 UTC m=+150.973609795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.462546 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.462741 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.962712595 +0000 UTC m=+151.074131932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.462989 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.463328 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.963316824 +0000 UTC m=+151.074736161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.563991 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.564351 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.064336368 +0000 UTC m=+151.175755695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.624234 4804 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9d9jq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.624284 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podUID="360a1093-b581-4806-9f88-3d3907bd4895" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.634102 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9773c5fa71245a921a7c993b56e54bfffe31e532ff0b9ea4ad398b93725f7e05"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.634169 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4773b0effe440a524c8382bb47c37b2839d736dfb4ea26c7ae3a826a894deedc"} Feb 17 13:27:56 crc 
kubenswrapper[4804]: I0217 13:27:56.634361 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.636533 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6db35b0bde03ead0a5ecb051839cfb7dd6a87126d40d29d3f474c8ca1b1c4cee"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.636567 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"66b1dd170c2216410643352bb5d78bf689d4dc2fc85b0d37802bf28f239c9a34"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.638033 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"114c65613ac1b1c3fd98ddff99d9e68e0fbcf7f285e30ea7df070f3b81b69753"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.638063 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"15e64c1706dd44afc63fd2182ca86597d8d37b981fbf914ec0d02c0fb33adc8e"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.665822 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.666159 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.166144419 +0000 UTC m=+151.277563756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.767345 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.767491 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.267473183 +0000 UTC m=+151.378892520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.767612 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.768171 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.268162316 +0000 UTC m=+151.379581643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.869055 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.869244 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.36921894 +0000 UTC m=+151.480638277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.869270 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.869576 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.369568352 +0000 UTC m=+151.480987689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.970251 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.970644 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.470610807 +0000 UTC m=+151.582030144 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.071367 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:57 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:57 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:57 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.071457 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.072122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.072477 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:57.572462159 +0000 UTC m=+151.683881556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.173061 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.173307 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.673270895 +0000 UTC m=+151.784690232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.173627 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.173945 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.673937148 +0000 UTC m=+151.785356485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.275433 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.275647 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.775617963 +0000 UTC m=+151.887037300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.275802 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.276097 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.776085389 +0000 UTC m=+151.887504726 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.342295 4804 patch_prober.go:28] interesting pod/console-f9d7485db-tz5vz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.342375 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tz5vz" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.349271 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.349343 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.350585 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.350689 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.353027 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.359146 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.360456 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.377389 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.377556 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.877523257 +0000 UTC m=+151.988942614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.377908 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.378232 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.87822136 +0000 UTC m=+151.989640747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.478592 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.478798 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.478941 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.479076 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:57.979047228 +0000 UTC m=+152.090466565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580772 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.580949 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.08093226 +0000 UTC m=+152.192351597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.584913 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.586078 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.591039 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.601844 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.624469 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632288 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632351 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632373 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632424 4804 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.670474 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.681457 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.681753 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.682034 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.682062 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.682087 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.682234 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.182220203 +0000 UTC m=+152.293639540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.783156 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.783588 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc 
kubenswrapper[4804]: I0217 13:27:57.783623 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.783667 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.784869 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.785207 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.785444 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.28543312 +0000 UTC m=+152.396852457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.785823 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.786736 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.789775 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.832111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.870614 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.886767 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 
crc kubenswrapper[4804]: I0217 13:27:57.887031 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.887073 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.887128 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.887280 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.387261211 +0000 UTC m=+152.498680548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.900562 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.987947 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988005 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988032 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988111 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.988438 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.488416559 +0000 UTC m=+152.599835906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988477 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988517 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.989954 4804 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.990863 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.012317 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.030233 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.049309 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.075368 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:58 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:58 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:58 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.075424 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.089501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.091090 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.091161 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.091233 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.091392 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.591363707 +0000 UTC m=+152.702783044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.108875 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.190673 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.193642 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.196971 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.197075 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.197142 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.197218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.197497 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.697486472 +0000 UTC m=+152.808905809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.201684 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.202024 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.223518 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.254581 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.299830 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.300074 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.300164 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.300182 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.300296 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.800280885 +0000 UTC m=+152.911700222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.324491 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.371995 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402828 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"certified-operators-dfpnq\" (UID: 
\"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.403260 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.403682 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.903671998 +0000 UTC m=+153.015091335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.403980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.468547 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.511842 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.512347 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:59.012326368 +0000 UTC m=+153.123745715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.562577 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.613363 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.613869 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.113857408 +0000 UTC m=+153.225276735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.715532 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.715823 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.215809784 +0000 UTC m=+153.327229121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.744574 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.745976 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"84be3e5e7a29f1e7c2df4e9c48178fc69447e44a3a7ff354079c9086c2b1423d"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.746011 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"fe7424468529c03e2fae2003698ddb34eed0cf212f15f0eadffad5da9e45a22a"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.747242 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerStarted","Data":"14bd0e0c6146aca8722f654770d91415f769ddfe462bd310b48fc23e91722dce"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.748553 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerStarted","Data":"0268853808d2c1a2c3d8e2668996471a68d66a841ce6ccf78ec063e7971f0d58"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.817112 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.817462 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.317449548 +0000 UTC m=+153.428868885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.921294 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.922169 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.422143505 +0000 UTC m=+153.533562842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.983408 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.023174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: E0217 13:27:59.023592 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.523575582 +0000 UTC m=+153.634994929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.042501 4804 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.061955 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.073382 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:59 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.073432 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.083438 4804 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T13:27:59.042527877Z","Handler":null,"Name":""} Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.094557 4804 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.094594 4804 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.099699 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.124588 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.156330 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.214298 4804 patch_prober.go:28] interesting pod/apiserver-76f77b778f-46w22 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]log ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]etcd ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/generic-apiserver-start-informers ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/max-in-flight-filter ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 17 13:27:59 crc kubenswrapper[4804]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/project.openshift.io-projectcache ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/openshift.io-startinformers ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 17 13:27:59 crc kubenswrapper[4804]: livez check failed Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.214353 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-46w22" podUID="3ea797e4-54e0-4063-8d2b-647f6686e2a8" containerName="openshift-apiserver" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.226668 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.321367 4804 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.321678 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.361273 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.554605 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:59 crc 
kubenswrapper[4804]: I0217 13:27:59.634080 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.761971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerStarted","Data":"f8fddc3c1f1b98532bbecd6c7da5c2a2368e8ed8a3bd8f6f7983638879bf50a9"}
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.763564 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerStarted","Data":"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c"}
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.764389 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerStarted","Data":"21bf4e05af6fa23bdde7a029ebf7c31d1a22cc2791c5a01af78f87549037e881"}
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.770447 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"]
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.771583 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.774307 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.775934 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerStarted","Data":"7c78c7947559e8f76292ea42131dae6c0ad7eaf265131a245dfed7a7568f72f2"}
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.777854 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerStarted","Data":"be09cbde5111c6442fb7580667b29d0357b1495c50edff7352458e4b0ddab9db"}
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.788515 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"]
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.838604 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.838735 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.838926 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.842844 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"]
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.939633 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.939702 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.939744 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.940409 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.940629 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.962306 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.010492 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.066954 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.071368 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 13:28:00 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld
Feb 17 13:28:00 crc kubenswrapper[4804]: [+]process-running ok
Feb 17 13:28:00 crc kubenswrapper[4804]: healthz check failed
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.071429 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.090322 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.100059 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.103950 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.167060 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.168346 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.177621 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.245092 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.245216 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.245243 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.347223 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.347771 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.347969 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.348131 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.348561 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.372862 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.455527 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.457629 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:28:00 crc kubenswrapper[4804]: W0217 13:28:00.457820 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a10f4e7_7906_43aa_98fb_e709a71a55d2.slice/crio-122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74 WatchSource:0}: Error finding container 122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74: Status 404 returned error can't find the container with id 122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.504087 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.538704 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.559924 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.564664 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.570516 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.570668 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.571100 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.583261 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.628347 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.653654 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.653764 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.754602 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.754745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.754818 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.784628 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.785727 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.787058 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.788574 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.811780 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.828993 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerStarted","Data":"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.829058 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerStarted","Data":"4dd741b3c38a0505bebb7c99e18c919af01e075e7767edd7ca2356d4e858351e"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.829907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.831589 4804 generic.go:334] "Generic (PLEG): container finished" podID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerID="9da518d6a4ba94c30fc4e543aae3a6e806450f9d2bafc8157ce03ab22879d7ef" exitCode=0
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.831642 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"9da518d6a4ba94c30fc4e543aae3a6e806450f9d2bafc8157ce03ab22879d7ef"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.833313 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.835964 4804 generic.go:334] "Generic (PLEG): container finished" podID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerID="0a5fa9448a9b147d71180506aad70bb2187e4381cb523e0918b556f39008479f" exitCode=0
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.836027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"0a5fa9448a9b147d71180506aad70bb2187e4381cb523e0918b556f39008479f"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.856943 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.857084 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.857139 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.877968 4804 generic.go:334] "Generic (PLEG): container finished" podID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44" exitCode=0
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.878103 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.888350 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"447be96020e56044a9ec997c50432488c0c2f1e04113a47213b1169a9b9d44be"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.892654 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.893589 4804 generic.go:334] "Generic (PLEG): container finished" podID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerID="4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227" exitCode=0
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.893609 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerDied","Data":"4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.897134 4804 generic.go:334] "Generic (PLEG): container finished" podID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" exitCode=0
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.897352 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.899232 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerStarted","Data":"122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.902298 4804 generic.go:334] "Generic (PLEG): container finished" podID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerID="7c78c7947559e8f76292ea42131dae6c0ad7eaf265131a245dfed7a7568f72f2" exitCode=0
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.902897 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerDied","Data":"7c78c7947559e8f76292ea42131dae6c0ad7eaf265131a245dfed7a7568f72f2"}
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.938381 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"]
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.938382 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" podStartSLOduration=133.93836647 podStartE2EDuration="2m13.93836647s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:00.921810705 +0000 UTC m=+155.033230052" watchObservedRunningTime="2026-02-17 13:28:00.93836647 +0000 UTC m=+155.049785807"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.958180 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.959347 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.959972 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.960106 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.961167 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.980006 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.021687 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" podStartSLOduration=15.02167073 podStartE2EDuration="15.02167073s" podCreationTimestamp="2026-02-17 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:01.019477726 +0000 UTC m=+155.130897063" watchObservedRunningTime="2026-02-17 13:28:01.02167073 +0000 UTC m=+155.133090057"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.070220 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 13:28:01 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld
Feb 17 13:28:01 crc kubenswrapper[4804]: [+]process-running ok
Feb 17 13:28:01 crc kubenswrapper[4804]: healthz check failed
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.070265 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.136689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 17 13:28:01 crc kubenswrapper[4804]: W0217 13:28:01.145510 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod725ad1d2_2625_4eeb_b16b_7bc5ecb54c23.slice/crio-e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92 WatchSource:0}: Error finding container e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92: Status 404 returned error can't find the container with id e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.167497 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"]
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.170299 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.178375 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.181258 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"]
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.265459 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.265642 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.265803 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.367698 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.367765 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.367853 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.368175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.368256 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.377422 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"]
Feb 17 13:28:01 crc kubenswrapper[4804]: W0217 13:28:01.384373 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dbfd9db_3d17_44af_ab32_d2f7e7a1fab5.slice/crio-6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161 WatchSource:0}: Error finding container 6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161: Status 404 returned error can't find the container with id 6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.388972 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.490559 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.722600 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"]
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.912333 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613" exitCode=0
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.912413 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613"}
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.912867 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerStarted","Data":"7e1b2fb29927815e4957ff56f7ae370566373e378aef77389a1de5a8d2809eef"}
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.914771 4804 generic.go:334] "Generic (PLEG): container finished" podID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76" exitCode=0
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.914881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76"}
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.914928 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerStarted","Data":"6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161"}
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.921080 4804 generic.go:334] "Generic (PLEG): container finished" podID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" exitCode=0
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.921186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5"}
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.924688 4804 generic.go:334] "Generic (PLEG): container finished" podID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerID="fd63f395d9d2acc2a5229430110a217a86178b2333399d07e264a3b4cbc4fc4b" exitCode=0
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.924973 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"fd63f395d9d2acc2a5229430110a217a86178b2333399d07e264a3b4cbc4fc4b"}
Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.925019 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerStarted","Data":"c5910c70e84a82abe005c7000c40085a9ab0598685cbc3225b9df0cad35f66af"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.928217 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerStarted","Data":"2d3cec7a2f95695d7c010f1a7f6b64ed68e16ed941a591c50fa5d0451060f1fe"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.928251 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerStarted","Data":"e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.026364 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.026344592 podStartE2EDuration="2.026344592s" podCreationTimestamp="2026-02-17 13:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:02.006388723 +0000 UTC m=+156.117808050" watchObservedRunningTime="2026-02-17 13:28:02.026344592 +0000 UTC m=+156.137763929" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.072436 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:02 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:02 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:02 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 
13:28:02.072493 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.208552 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.212338 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283409 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"3768c453-c58d-4768-9620-a202cbb8ccd8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283611 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod \"3768c453-c58d-4768-9620-a202cbb8ccd8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283691 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"3768c453-c58d-4768-9620-a202cbb8ccd8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283776 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.284265 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" (UID: "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.287469 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume" (OuterVolumeSpecName: "config-volume") pod "3768c453-c58d-4768-9620-a202cbb8ccd8" (UID: "3768c453-c58d-4768-9620-a202cbb8ccd8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.292353 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" (UID: "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.292391 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv" (OuterVolumeSpecName: "kube-api-access-7z9lv") pod "3768c453-c58d-4768-9620-a202cbb8ccd8" (UID: "3768c453-c58d-4768-9620-a202cbb8ccd8"). InnerVolumeSpecName "kube-api-access-7z9lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.295517 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3768c453-c58d-4768-9620-a202cbb8ccd8" (UID: "3768c453-c58d-4768-9620-a202cbb8ccd8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387471 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387500 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387513 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387523 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") 
on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387534 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.937961 4804 generic.go:334] "Generic (PLEG): container finished" podID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerID="2d3cec7a2f95695d7c010f1a7f6b64ed68e16ed941a591c50fa5d0451060f1fe" exitCode=0 Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.938075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerDied","Data":"2d3cec7a2f95695d7c010f1a7f6b64ed68e16ed941a591c50fa5d0451060f1fe"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.942755 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerDied","Data":"0268853808d2c1a2c3d8e2668996471a68d66a841ce6ccf78ec063e7971f0d58"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.942784 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0268853808d2c1a2c3d8e2668996471a68d66a841ce6ccf78ec063e7971f0d58" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.942803 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.947974 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerDied","Data":"76ca0c3a1f23c1bfd5829400e9cd39546fdd21f9b110e5b71897bf1278603129"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.948041 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ca0c3a1f23c1bfd5829400e9cd39546fdd21f9b110e5b71897bf1278603129" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.948045 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:28:03 crc kubenswrapper[4804]: I0217 13:28:03.070530 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:03 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:03 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:03 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:03 crc kubenswrapper[4804]: I0217 13:28:03.070643 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.069179 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Feb 17 13:28:04 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:04 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:04 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.069244 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.211440 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.217615 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.515777 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.628607 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.628755 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.629044 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" (UID: "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.660367 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" (UID: "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.730924 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.730963 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.012929 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerDied","Data":"e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92"} Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.012977 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.012991 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.077502 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:05 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:05 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:05 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.083714 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.415092 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d6mxf" Feb 17 13:28:06 crc kubenswrapper[4804]: I0217 13:28:06.068885 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:06 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:06 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:06 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:06 crc kubenswrapper[4804]: I0217 13:28:06.068958 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" 
podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.069070 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:07 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:07 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:07 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.069183 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.339193 4804 patch_prober.go:28] interesting pod/console-f9d7485db-tz5vz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.339288 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tz5vz" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.638625 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:28:07 
crc kubenswrapper[4804]: I0217 13:28:07.638749 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.638939 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.638981 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:28:08 crc kubenswrapper[4804]: I0217 13:28:08.069393 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:08 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:08 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:08 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:08 crc kubenswrapper[4804]: I0217 13:28:08.069758 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.068923 4804 patch_prober.go:28] interesting 
pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:09 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:09 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:09 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.069333 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.925677 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.946237 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:28:10 crc kubenswrapper[4804]: I0217 13:28:10.033807 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:28:10 crc kubenswrapper[4804]: I0217 13:28:10.069884 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:10 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:10 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:10 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:10 crc kubenswrapper[4804]: I0217 13:28:10.069955 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:11 crc kubenswrapper[4804]: I0217 13:28:11.069727 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:11 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:11 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:11 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:11 crc kubenswrapper[4804]: I0217 13:28:11.070005 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:12 crc kubenswrapper[4804]: I0217 13:28:12.069247 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:12 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:12 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:12 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:12 crc kubenswrapper[4804]: I0217 13:28:12.069376 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:13 crc kubenswrapper[4804]: I0217 13:28:13.074940 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:28:13 crc kubenswrapper[4804]: I0217 13:28:13.086025 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.439385 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.440036 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" containerID="cri-o://e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2" gracePeriod=30 Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.445806 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"] Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.446077 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" 
podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" containerID="cri-o://d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301" gracePeriod=30
Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.119738 4804 generic.go:334] "Generic (PLEG): container finished" podID="1d929eaa-807c-4809-8b8a-78c186418e71" containerID="e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2" exitCode=0
Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.119809 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerDied","Data":"e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2"}
Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.121118 4804 generic.go:334] "Generic (PLEG): container finished" podID="faba1ad1-aeda-412d-9824-36cc045bab86" containerID="d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301" exitCode=0
Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.121147 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerDied","Data":"d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301"}
Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.444182 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.451039 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.637132 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-w4nl5"
Feb 17 13:28:19 crc kubenswrapper[4804]: I0217 13:28:19.642047 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:28:20 crc kubenswrapper[4804]: I0217 13:28:20.086822 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Feb 17 13:28:20 crc kubenswrapper[4804]: I0217 13:28:20.087154 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Feb 17 13:28:21 crc kubenswrapper[4804]: I0217 13:28:21.003471 4804 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mqkcq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: i/o timeout" start-of-body=
Feb 17 13:28:21 crc kubenswrapper[4804]: I0217 13:28:21.003527 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: i/o timeout"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.766236 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.802629 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"]
Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803014 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerName="collect-profiles"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803048 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerName="collect-profiles"
Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803083 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803100 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager"
Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803123 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerName="pruner"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803136 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerName="pruner"
Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803158 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerName="pruner"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803173 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerName="pruner"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803442 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803477 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerName="collect-profiles"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803505 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerName="pruner"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803527 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerName="pruner"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.804160 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.824884 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"]
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847773 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") "
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847838 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") "
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847921 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") "
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847982 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") "
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848137 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848231 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848247 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848277 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848836 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config" (OuterVolumeSpecName: "config") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848912 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca" (OuterVolumeSpecName: "client-ca") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.860286 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.861639 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b" (OuterVolumeSpecName: "kube-api-access-2wz5b") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). InnerVolumeSpecName "kube-api-access-2wz5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950093 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950154 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950221 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950258 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950375 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950397 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950415 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950432 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.952261 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.952554 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.955873 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.974840 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.132362 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.176925 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerDied","Data":"a4b6cbfefaf077ffe0f3e71671fde2907fe889b88fd4a0d27ee5e5b910c2832f"}
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.176994 4804 scope.go:117] "RemoveContainer" containerID="d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301"
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.177050 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.223509 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"]
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.228135 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"]
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.835360 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.835436 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:28:26 crc kubenswrapper[4804]: I0217 13:28:26.581967 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" path="/var/lib/kubelet/pods/faba1ad1-aeda-412d-9824-36cc045bab86/volumes"
Feb 17 13:28:30 crc kubenswrapper[4804]: I0217 13:28:30.258749 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"
Feb 17 13:28:31 crc kubenswrapper[4804]: I0217 13:28:31.086508 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 17 13:28:31 crc kubenswrapper[4804]: I0217 13:28:31.086564 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.365112 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.366797 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.367783 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.371099 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.377592 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.500331 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.500461 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.544541 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"]
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.602118 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.602228 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.602334 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.610703 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jfgm"]
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.626090 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.692616 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:28:36 crc kubenswrapper[4804]: I0217 13:28:36.155911 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:28:40 crc kubenswrapper[4804]: E0217 13:28:40.301689 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 17 13:28:40 crc kubenswrapper[4804]: E0217 13:28:40.303006 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6cwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hpw7w_openshift-marketplace(cbda9f29-b199-4a42-8757-f5ecc90f0437): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 13:28:40 crc kubenswrapper[4804]: E0217 13:28:40.304413 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hpw7w" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437"
Feb 17 13:28:40 crc kubenswrapper[4804]: I0217 13:28:40.946051 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 13:28:40 crc kubenswrapper[4804]: I0217 13:28:40.946878 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:40 crc kubenswrapper[4804]: I0217 13:28:40.958006 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.071864 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.072221 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.072272 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.086661 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.086799 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173443 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173579 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.193308 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:42 crc kubenswrapper[4804]: I0217 13:28:42.129569 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.193604 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hpw7w" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437"
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.274354 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.274545 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hh9rr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dfpnq_openshift-marketplace(af8f355f-84e5-49b0-83f4-b87ce7bb4015): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.276568 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dfpnq" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015"
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.481559 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.481712 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmjk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-c4fxk_openshift-marketplace(3d715b9f-61c8-4851-a4b1-452f9f3ea8bd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.483053 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd"
Feb 17 13:28:44 crc kubenswrapper[4804]: E0217 13:28:44.436620 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 17 13:28:44 crc kubenswrapper[4804]: E0217 13:28:44.437094 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nf9xm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-f9k56_openshift-marketplace(dd3f4542-6055-4524-9e05-58b4c9a16e37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 13:28:44 crc kubenswrapper[4804]: E0217 13:28:44.438289 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-f9k56" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37"
Feb 17 13:28:45 crc 
kubenswrapper[4804]: E0217 13:28:45.286138 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 13:28:45 crc kubenswrapper[4804]: E0217 13:28:45.286345 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pm9gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-xf58f_openshift-marketplace(4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:45 crc kubenswrapper[4804]: E0217 13:28:45.288612 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xf58f" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041239 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xf58f" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041738 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-f9k56" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041823 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dfpnq" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041901 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.097716 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130043 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.130278 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130289 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130385 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130759 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.139714 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.166493 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" event={"ID":"e77722ba-d383-442c-b6dc-9983cf233257","Type":"ContainerStarted","Data":"828853758eab48da037c771cf13c8e4fb60cb60ab76a545908fd820fdf6be8a4"} Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.168266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerDied","Data":"c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45"} Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.168364 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215011 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215084 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215123 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215156 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215246 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.216037 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.216144 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.216310 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config" (OuterVolumeSpecName: "config") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.221782 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.222187 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554" (OuterVolumeSpecName: "kube-api-access-8x554") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "kube-api-access-8x554". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.316973 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.317853 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.317929 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318108 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318226 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318297 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318315 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318330 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318346 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318359 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419256 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 
13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419326 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419413 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419470 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419543 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.510297 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.515274 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.581587 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" path="/var/lib/kubelet/pods/1d929eaa-807c-4809-8b8a-78c186418e71/volumes" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.610785 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.610825 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.611644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.613741 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 
13:28:48.614937 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.910189 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.022382 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.022589 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdxzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fvtl6_openshift-marketplace(6a10f4e7-7906-43aa-98fb-e709a71a55d2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.023874 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fvtl6" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" Feb 17 13:28:49 crc 
kubenswrapper[4804]: I0217 13:28:49.050491 4804 scope.go:117] "RemoveContainer" containerID="e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.217128 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fvtl6" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.299447 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"] Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.515749 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:28:49 crc kubenswrapper[4804]: W0217 13:28:49.529217 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9631847b_1aa3_4bbd_95d4_cee45d896b11.slice/crio-2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5 WatchSource:0}: Error finding container 2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5: Status 404 returned error can't find the container with id 2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5 Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.568446 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 13:28:49 crc kubenswrapper[4804]: W0217 13:28:49.569568 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1c7ffc91_beb4_48c9_bd6a_3432eb40cb18.slice/crio-c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1 WatchSource:0}: Error finding container 
c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1: Status 404 returned error can't find the container with id c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1 Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.638382 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.638895 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rsj25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},
TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-j44f8_openshift-marketplace(4627be0e-b7ba-4e46-820b-0ce1271ecacb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.640037 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j44f8" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb"
Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.671600 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 13:28:49 crc kubenswrapper[4804]: W0217 13:28:49.691155 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9dda4da8_c5ea_4c8a_8443_d7e31eba95af.slice/crio-e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1 WatchSource:0}: Error finding container e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1: Status 404 returned error can't find the container with id e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.185634 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerStarted","Data":"e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1"}
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.188890 4804 generic.go:334] "Generic (PLEG): container finished" podID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" exitCode=0
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.189009 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef"}
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.191764 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" event={"ID":"e77722ba-d383-442c-b6dc-9983cf233257","Type":"ContainerStarted","Data":"cf47b71406f4bbe5bd193f96862e508e5e5e5461f1ee8dd7f13c9d10769af71a"}
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.196728 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerStarted","Data":"c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1"}
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.202490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerStarted","Data":"2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5"}
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.204416 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerStarted","Data":"a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f"}
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.204498 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerStarted","Data":"11a6eeb787a54a0159bc228994e888bc7b3352ae3c3c245dcc87e80f7f925b09"}
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.204619 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager" containerID="cri-o://a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f" gracePeriod=30
Feb 17 13:28:50 crc kubenswrapper[4804]: E0217 13:28:50.207020 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j44f8" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb"
Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.240498 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" podStartSLOduration=35.240475917 podStartE2EDuration="35.240475917s" podCreationTimestamp="2026-02-17 13:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:50.238131759 +0000 UTC m=+204.349551106" watchObservedRunningTime="2026-02-17 13:28:50.240475917 +0000 UTC m=+204.351895254"
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.213080 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerStarted","Data":"6155d5f4b6c5d243b45066428b06822d531e0daedd2837b5de7761b60473e5c3"}
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.216029 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" event={"ID":"e77722ba-d383-442c-b6dc-9983cf233257","Type":"ContainerStarted","Data":"ac736acdb9f4adaaf3c6fe0c81a3865edef47271655b25ba995d0941cd6f23c6"}
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.220026 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerStarted","Data":"c268cbeacb8edca4cf6be1f9ade9d17e4f9a777b74947e1265bd5b8b02378689"}
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.221801 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerStarted","Data":"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"}
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.222152 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.225424 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=16.22540711 podStartE2EDuration="16.22540711s" podCreationTimestamp="2026-02-17 13:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.224440518 +0000 UTC m=+205.335859855" watchObservedRunningTime="2026-02-17 13:28:51.22540711 +0000 UTC m=+205.336826447"
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.229033 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6bb46c8d9c-hxxp8_2cc5d152-9369-4574-ab6b-05d9d4c5afd7/route-controller-manager/0.log"
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.229091 4804 generic.go:334] "Generic (PLEG): container finished" podID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerID="a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f" exitCode=255
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.229123 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerDied","Data":"a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f"}
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.230881 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.242469 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4jfgm" podStartSLOduration=184.242447388 podStartE2EDuration="3m4.242447388s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.240019477 +0000 UTC m=+205.351438824" watchObservedRunningTime="2026-02-17 13:28:51.242447388 +0000 UTC m=+205.353866725"
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.257274 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=11.257255192 podStartE2EDuration="11.257255192s" podCreationTimestamp="2026-02-17 13:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.255639149 +0000 UTC m=+205.367058486" watchObservedRunningTime="2026-02-17 13:28:51.257255192 +0000 UTC m=+205.368675069"
Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.272486 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" podStartSLOduration=16.27247132 podStartE2EDuration="16.27247132s" podCreationTimestamp="2026-02-17 13:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.270936669 +0000 UTC m=+205.382356016" watchObservedRunningTime="2026-02-17 13:28:51.27247132 +0000 UTC m=+205.383890657"
Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.141691 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.835870 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.835986 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.836066 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.837061 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.837248 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b" gracePeriod=600
Feb 17 13:28:56 crc kubenswrapper[4804]: I0217 13:28:56.141865 4804 patch_prober.go:28] interesting pod/route-controller-manager-6bb46c8d9c-hxxp8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 17 13:28:56 crc kubenswrapper[4804]: I0217 13:28:56.142455 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.270998 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6bb46c8d9c-hxxp8_2cc5d152-9369-4574-ab6b-05d9d4c5afd7/route-controller-manager/0.log"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.271056 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerDied","Data":"11a6eeb787a54a0159bc228994e888bc7b3352ae3c3c245dcc87e80f7f925b09"}
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.271085 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a6eeb787a54a0159bc228994e888bc7b3352ae3c3c245dcc87e80f7f925b09"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.311269 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6bb46c8d9c-hxxp8_2cc5d152-9369-4574-ab6b-05d9d4c5afd7/route-controller-manager/0.log"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.311344 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.378956 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"]
Feb 17 13:28:57 crc kubenswrapper[4804]: E0217 13:28:57.379243 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.379262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.379386 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.379806 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.388944 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"]
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.471928 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") "
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472285 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") "
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472442 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") "
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472559 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") "
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472796 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472869 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472995 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config" (OuterVolumeSpecName: "config") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473025 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473146 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473196 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473303 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473320 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.477408 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp" (OuterVolumeSpecName: "kube-api-access-l74sp") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "kube-api-access-l74sp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.478348 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574240 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574312 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574350 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574543 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574945 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") on node \"crc\" DevicePath \"\""
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.575490 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.576567 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.579439 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.591411 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.700768 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.277249 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"
Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.314556 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"]
Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.317540 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"]
Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.474544 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"]
Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.582603 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" path="/var/lib/kubelet/pods/2cc5d152-9369-4574-ab6b-05d9d4c5afd7/volumes"
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.298348 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b" exitCode=0
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.298880 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b"}
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.298911 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7"}
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.301869 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerStarted","Data":"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"}
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.304936 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerStarted","Data":"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"}
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.304990 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerStarted","Data":"558d5dd2eecf846742fd5b4dd243c32953c0fb248ec2faa9cde568927170e4d7"}
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.305790 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.307632 4804 generic.go:334] "Generic (PLEG): container finished" podID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerID="6155d5f4b6c5d243b45066428b06822d531e0daedd2837b5de7761b60473e5c3" exitCode=0
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.307688 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerDied","Data":"6155d5f4b6c5d243b45066428b06822d531e0daedd2837b5de7761b60473e5c3"}
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.309105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerStarted","Data":"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd"}
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.315260 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.366057 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54w49" podStartSLOduration=4.762243784 podStartE2EDuration="1m2.366041176s" podCreationTimestamp="2026-02-17 13:27:57 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.907756174 +0000 UTC m=+155.019175511" lastFinishedPulling="2026-02-17 13:28:58.511553566 +0000 UTC m=+212.622972903" observedRunningTime="2026-02-17 13:28:59.363975367 +0000 UTC m=+213.475394704" watchObservedRunningTime="2026-02-17 13:28:59.366041176 +0000 UTC m=+213.477460513"
Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.387043 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" podStartSLOduration=24.387022166 podStartE2EDuration="24.387022166s" podCreationTimestamp="2026-02-17 13:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:59.381490801 +0000 UTC m=+213.492910148" watchObservedRunningTime="2026-02-17 13:28:59.387022166 +0000 UTC m=+213.498441503"
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.315611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerStarted","Data":"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e"}
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.320289 4804 generic.go:334] "Generic (PLEG): container finished" podID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19" exitCode=0
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.320487 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"}
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.631530 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.826715 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") "
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.827133 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") "
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.827458 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9dda4da8-c5ea-4c8a-8443-d7e31eba95af" (UID: "9dda4da8-c5ea-4c8a-8443-d7e31eba95af"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.832488 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9dda4da8-c5ea-4c8a-8443-d7e31eba95af" (UID: "9dda4da8-c5ea-4c8a-8443-d7e31eba95af"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.929431 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.929493 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.326858 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.327019 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerDied","Data":"e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1"}
Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.327061 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1"
Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.328759 4804 generic.go:334] "Generic (PLEG): container finished" podID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e" exitCode=0
Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.328804 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e"}
Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.332098 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"01f4c85dd4ed77fe0ae4fd3853fe066d8a2f72e40a5062bcaf2b92496b6c83fc"}
Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.332002 4804 generic.go:334] "Generic (PLEG): container finished" podID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerID="01f4c85dd4ed77fe0ae4fd3853fe066d8a2f72e40a5062bcaf2b92496b6c83fc" exitCode=0
Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.353113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerStarted","Data":"32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b"}
Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.357105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerStarted","Data":"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"}
Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.361723 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerStarted","Data":"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240"}
Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.374815 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9k56" podStartSLOduration=4.169809769 podStartE2EDuration="1m5.374796065s" podCreationTimestamp="2026-02-17 13:27:57 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.833023831 +0000 UTC m=+154.944443168" lastFinishedPulling="2026-02-17 13:29:02.038010127 +0000 UTC m=+216.149429464" observedRunningTime="2026-02-17 13:29:02.374190334 +0000 UTC m=+216.485609681" watchObservedRunningTime="2026-02-17 13:29:02.374796065 +0000 UTC m=+216.486215402"
Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.398162 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpw7w" podStartSLOduration=4.843019966 podStartE2EDuration="1m5.398146424s" podCreationTimestamp="2026-02-17 13:27:57 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.880369227 +0000 UTC m=+154.991788564" lastFinishedPulling="2026-02-17 13:29:01.435495685 +0000 UTC m=+215.546915022" observedRunningTime="2026-02-17 13:29:02.397362858 +0000 UTC m=+216.508782195" watchObservedRunningTime="2026-02-17 13:29:02.398146424 +0000 UTC m=+216.509565761"
Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.420684 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xf58f" podStartSLOduration=2.198926656 podStartE2EDuration="1m2.420664535s" podCreationTimestamp="2026-02-17 13:28:00 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.91673789 +0000 UTC m=+156.028157217" lastFinishedPulling="2026-02-17 13:29:02.138475759 +0000 UTC m=+216.249895096" observedRunningTime="2026-02-17 13:29:02.419825327 +0000 UTC m=+216.531244664" watchObservedRunningTime="2026-02-17 13:29:02.420664535 +0000 UTC m=+216.532083872"
Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.371067 4804 generic.go:334] "Generic (PLEG): container finished" podID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerID="a14465d915fa294528de1e1a532d12f42a2b05c614c04dfaa5801608931bc3fa" exitCode=0
Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.371159 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"a14465d915fa294528de1e1a532d12f42a2b05c614c04dfaa5801608931bc3fa"}
Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.376492 4804 generic.go:334] "Generic (PLEG): container finished" podID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerID="6b673b46083a7a7e870939da823bebf898513e413a5e11d451d621999b90a4eb" exitCode=0
Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.376531 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"6b673b46083a7a7e870939da823bebf898513e413a5e11d451d621999b90a4eb"}
Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.383780 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerStarted","Data":"f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f"}
Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.386075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerStarted","Data":"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"}
Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.388921 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerStarted","Data":"503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f"}
Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.437745 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dfpnq" podStartSLOduration=3.435925797 podStartE2EDuration="1m6.437726475s" podCreationTimestamp="2026-02-17 13:27:58 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.83746957 +0000 UTC m=+154.948888907" lastFinishedPulling="2026-02-17 13:29:03.839270248 +0000 UTC m=+217.950689585" observedRunningTime="2026-02-17 13:29:04.436011218
+0000 UTC m=+218.547430565" watchObservedRunningTime="2026-02-17 13:29:04.437726475 +0000 UTC m=+218.549145812" Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.438870 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j44f8" podStartSLOduration=2.3553082659999998 podStartE2EDuration="1m4.438863324s" podCreationTimestamp="2026-02-17 13:28:00 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.926532839 +0000 UTC m=+156.037952176" lastFinishedPulling="2026-02-17 13:29:04.010087897 +0000 UTC m=+218.121507234" observedRunningTime="2026-02-17 13:29:04.408581083 +0000 UTC m=+218.520000420" watchObservedRunningTime="2026-02-17 13:29:04.438863324 +0000 UTC m=+218.550282661" Feb 17 13:29:05 crc kubenswrapper[4804]: I0217 13:29:05.396695 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef" exitCode=0 Feb 17 13:29:05 crc kubenswrapper[4804]: I0217 13:29:05.396762 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"} Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.405617 4804 generic.go:334] "Generic (PLEG): container finished" podID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" exitCode=0 Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.406254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201"} Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.410687 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerStarted","Data":"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"} Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.441451 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c4fxk" podStartSLOduration=1.5327347630000001 podStartE2EDuration="1m5.44143441s" podCreationTimestamp="2026-02-17 13:28:01 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.914457944 +0000 UTC m=+156.025877281" lastFinishedPulling="2026-02-17 13:29:05.823157591 +0000 UTC m=+219.934576928" observedRunningTime="2026-02-17 13:29:06.440472498 +0000 UTC m=+220.551891845" watchObservedRunningTime="2026-02-17 13:29:06.44143441 +0000 UTC m=+220.552853747" Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.419853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerStarted","Data":"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b"} Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.441300 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fvtl6" podStartSLOduration=3.316118946 podStartE2EDuration="1m8.44128093s" podCreationTimestamp="2026-02-17 13:27:59 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.923078823 +0000 UTC m=+156.034498160" lastFinishedPulling="2026-02-17 13:29:07.048240807 +0000 UTC m=+221.159660144" observedRunningTime="2026-02-17 13:29:07.438623622 +0000 UTC m=+221.550042969" watchObservedRunningTime="2026-02-17 13:29:07.44128093 +0000 UTC m=+221.552700267" Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.901243 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54w49" 
Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.901312 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.110614 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"] Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.111159 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.111289 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.132838 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.180635 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.325931 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.325985 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.384305 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.469947 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.470942 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.478359 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.563731 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.563885 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.611590 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:09 crc kubenswrapper[4804]: I0217 13:29:09.468222 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.104654 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.104701 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.160020 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.504316 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.504388 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 
13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.551785 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.180301 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.180572 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.234404 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.481945 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.482741 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.491676 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.491737 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.610012 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.610296 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9k56" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server" 
containerID="cri-o://32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b" gracePeriod=2 Feb 17 13:29:12 crc kubenswrapper[4804]: I0217 13:29:12.542172 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server" probeResult="failure" output=< Feb 17 13:29:12 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s Feb 17 13:29:12 crc kubenswrapper[4804]: > Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.011685 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.011945 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dfpnq" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server" containerID="cri-o://503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f" gracePeriod=2 Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.451382 4804 generic.go:334] "Generic (PLEG): container finished" podID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerID="32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b" exitCode=0 Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.451456 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b"} Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.623773 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.797489 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"dd3f4542-6055-4524-9e05-58b4c9a16e37\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.797555 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"dd3f4542-6055-4524-9e05-58b4c9a16e37\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.797676 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"dd3f4542-6055-4524-9e05-58b4c9a16e37\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.798661 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities" (OuterVolumeSpecName: "utilities") pod "dd3f4542-6055-4524-9e05-58b4c9a16e37" (UID: "dd3f4542-6055-4524-9e05-58b4c9a16e37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.806664 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm" (OuterVolumeSpecName: "kube-api-access-nf9xm") pod "dd3f4542-6055-4524-9e05-58b4c9a16e37" (UID: "dd3f4542-6055-4524-9e05-58b4c9a16e37"). InnerVolumeSpecName "kube-api-access-nf9xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.858557 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd3f4542-6055-4524-9e05-58b4c9a16e37" (UID: "dd3f4542-6055-4524-9e05-58b4c9a16e37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.899882 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.899930 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.900022 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.009420 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"] Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.009709 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j44f8" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server" containerID="cri-o://f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f" gracePeriod=2 Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.462053 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerID="503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f" exitCode=0 Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.462138 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f"} Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.465072 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"21bf4e05af6fa23bdde7a029ebf7c31d1a22cc2791c5a01af78f87549037e881"} Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.465115 4804 scope.go:117] "RemoveContainer" containerID="32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.465343 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.471134 4804 generic.go:334] "Generic (PLEG): container finished" podID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerID="f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f" exitCode=0 Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.471178 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f"} Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.503802 4804 scope.go:117] "RemoveContainer" containerID="01f4c85dd4ed77fe0ae4fd3853fe066d8a2f72e40a5062bcaf2b92496b6c83fc" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.506530 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.509092 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.525953 4804 scope.go:117] "RemoveContainer" containerID="9da518d6a4ba94c30fc4e543aae3a6e806450f9d2bafc8157ce03ab22879d7ef" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.579632 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" path="/var/lib/kubelet/pods/dd3f4542-6055-4524-9e05-58b4c9a16e37/volumes" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.692169 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.810468 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.810547 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.810594 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.812238 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities" (OuterVolumeSpecName: "utilities") pod "af8f355f-84e5-49b0-83f4-b87ce7bb4015" (UID: "af8f355f-84e5-49b0-83f4-b87ce7bb4015"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.817702 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr" (OuterVolumeSpecName: "kube-api-access-hh9rr") pod "af8f355f-84e5-49b0-83f4-b87ce7bb4015" (UID: "af8f355f-84e5-49b0-83f4-b87ce7bb4015"). InnerVolumeSpecName "kube-api-access-hh9rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.904941 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af8f355f-84e5-49b0-83f4-b87ce7bb4015" (UID: "af8f355f-84e5-49b0-83f4-b87ce7bb4015"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.911488 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.911568 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.911581 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.265422 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.417905 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.417996 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.418115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.419277 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities" (OuterVolumeSpecName: "utilities") pod "4627be0e-b7ba-4e46-820b-0ce1271ecacb" (UID: "4627be0e-b7ba-4e46-820b-0ce1271ecacb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.422157 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25" (OuterVolumeSpecName: "kube-api-access-rsj25") pod "4627be0e-b7ba-4e46-820b-0ce1271ecacb" (UID: "4627be0e-b7ba-4e46-820b-0ce1271ecacb"). InnerVolumeSpecName "kube-api-access-rsj25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.448711 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4627be0e-b7ba-4e46-820b-0ce1271ecacb" (UID: "4627be0e-b7ba-4e46-820b-0ce1271ecacb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.478499 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.478488 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"be09cbde5111c6442fb7580667b29d0357b1495c50edff7352458e4b0ddab9db"} Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.478833 4804 scope.go:117] "RemoveContainer" containerID="503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.486441 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"c5910c70e84a82abe005c7000c40085a9ab0598685cbc3225b9df0cad35f66af"} Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.486476 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8"
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.512750 4804 scope.go:117] "RemoveContainer" containerID="6b673b46083a7a7e870939da823bebf898513e413a5e11d451d621999b90a4eb"
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.512906 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"]
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.514875 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"]
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.521336 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.521365 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.521374 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.523182 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"]
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.528241 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"]
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.549839 4804 scope.go:117] "RemoveContainer" containerID="0a5fa9448a9b147d71180506aad70bb2187e4381cb523e0918b556f39008479f"
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.570383 4804 scope.go:117] "RemoveContainer" containerID="f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f"
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.594843 4804 scope.go:117] "RemoveContainer" containerID="a14465d915fa294528de1e1a532d12f42a2b05c614c04dfaa5801608931bc3fa"
Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.610966 4804 scope.go:117] "RemoveContainer" containerID="fd63f395d9d2acc2a5229430110a217a86178b2333399d07e264a3b4cbc4fc4b"
Feb 17 13:29:16 crc kubenswrapper[4804]: I0217 13:29:16.580226 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" path="/var/lib/kubelet/pods/4627be0e-b7ba-4e46-820b-0ce1271ecacb/volumes"
Feb 17 13:29:16 crc kubenswrapper[4804]: I0217 13:29:16.581121 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" path="/var/lib/kubelet/pods/af8f355f-84e5-49b0-83f4-b87ce7bb4015/volumes"
Feb 17 13:29:20 crc kubenswrapper[4804]: I0217 13:29:20.157636 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:29:21 crc kubenswrapper[4804]: I0217 13:29:21.534276 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:29:21 crc kubenswrapper[4804]: I0217 13:29:21.572272 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:29:22 crc kubenswrapper[4804]: I0217 13:29:22.410662 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"]
Feb 17 13:29:23 crc kubenswrapper[4804]: I0217 13:29:23.523924 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server" containerID="cri-o://79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f" gracePeriod=2
Feb 17 13:29:23 crc kubenswrapper[4804]: I0217 13:29:23.884386 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.042833 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") "
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.043160 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") "
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.043258 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") "
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.043815 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities" (OuterVolumeSpecName: "utilities") pod "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" (UID: "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.049217 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4" (OuterVolumeSpecName: "kube-api-access-jmjk4") pod "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" (UID: "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd"). InnerVolumeSpecName "kube-api-access-jmjk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.144810 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.144847 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.169234 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" (UID: "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.245921 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533289 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f" exitCode=0
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533332 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"}
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533351 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533370 4804 scope.go:117] "RemoveContainer" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533358 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"7e1b2fb29927815e4957ff56f7ae370566373e378aef77389a1de5a8d2809eef"}
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.550693 4804 scope.go:117] "RemoveContainer" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.564400 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"]
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.567385 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"]
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.580383 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" path="/var/lib/kubelet/pods/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd/volumes"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.586389 4804 scope.go:117] "RemoveContainer" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.601313 4804 scope.go:117] "RemoveContainer" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"
Feb 17 13:29:24 crc kubenswrapper[4804]: E0217 13:29:24.601745 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f\": container with ID starting with 79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f not found: ID does not exist" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.601789 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"} err="failed to get container status \"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f\": rpc error: code = NotFound desc = could not find container \"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f\": container with ID starting with 79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f not found: ID does not exist"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.601821 4804 scope.go:117] "RemoveContainer" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"
Feb 17 13:29:24 crc kubenswrapper[4804]: E0217 13:29:24.602208 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef\": container with ID starting with e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef not found: ID does not exist" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.602238 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"} err="failed to get container status \"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef\": rpc error: code = NotFound desc = could not find container \"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef\": container with ID starting with e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef not found: ID does not exist"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.602253 4804 scope.go:117] "RemoveContainer" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613"
Feb 17 13:29:24 crc kubenswrapper[4804]: E0217 13:29:24.602485 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613\": container with ID starting with 0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613 not found: ID does not exist" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613"
Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.602509 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613"} err="failed to get container status \"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613\": rpc error: code = NotFound desc = could not find container \"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613\": container with ID starting with 0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613 not found: ID does not exist"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.883532 4804 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884061 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884081 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884101 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884109 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884120 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884127 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884139 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884144 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884151 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884157 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884166 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884172 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884179 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884185 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884238 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884247 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884260 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884267 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884279 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884287 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-utilities"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884297 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884305 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884313 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884320 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="extract-content"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884331 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerName="pruner"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884338 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerName="pruner"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884435 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884452 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884461 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884470 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884477 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerName="pruner"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884760 4804 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884996 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094" gracePeriod=15
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885150 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885390 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81" gracePeriod=15
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885561 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc" gracePeriod=15
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885603 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c" gracePeriod=15
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885670 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8" gracePeriod=15
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887456 4804 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887593 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887607 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887616 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887623 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887632 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887638 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887651 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887656 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887668 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887673 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887682 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887688 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887772 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887782 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887790 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887797 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887806 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887893 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887900 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887982 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896816 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896948 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897008 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897107 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897135 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897175 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.921589 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000677 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000721 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000756 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000778 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000754 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000844 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000863 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001051 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001069 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001116 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001119 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001154 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.217421 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:29:28 crc kubenswrapper[4804]: W0217 13:29:28.240682 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739 WatchSource:0}: Error finding container f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739: Status 404 returned error can't find the container with id f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739
Feb 17 13:29:28 crc kubenswrapper[4804]: E0217 13:29:28.244695 4804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950bc4c85f2707 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,LastTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.568678 4804 generic.go:334] "Generic (PLEG): container finished" podID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerID="c268cbeacb8edca4cf6be1f9ade9d17e4f9a777b74947e1265bd5b8b02378689" exitCode=0
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.568786 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerDied","Data":"c268cbeacb8edca4cf6be1f9ade9d17e4f9a777b74947e1265bd5b8b02378689"}
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.570740 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.571168 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.571603 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.583149 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.584411 4804 log.go:25] "Finished
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.584980 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81" exitCode=0 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.585011 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c" exitCode=0 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.585018 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc" exitCode=0 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.585027 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8" exitCode=2 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.587171 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.587787 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: 
I0217 13:29:28.588273 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.589078 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf"} Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.589115 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739"} Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.589132 4804 scope.go:117] "RemoveContainer" containerID="f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.438260 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:77c09c30acdeaaf95ab463052841d32404d264d7b46bead6207afe51848d25e3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b7b252dee7cfed79b278bcdec32ab88d70e98e83e6c0db9565a87d9e962cfecb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1701350082},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:14398311b101163ddd1de78c093e161c5d3c9aac51a04e3d3d842fca6317ab0f\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5a091792b99bf4dfaec25f4c8e29da579e2f452d48b924c8323a18accb7f3290\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1234637517},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad77d0ead8abca8b884fad3be18215dbe8b4f8f098053551e4a899298cf5c918\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5338e2ca87e0b47fec93f55559f0ed6b39eef3ed3b7f085a4f0b205ccb86a5d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1213306565},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:28df36269fc553eb1adba5566d6dfc258a1a74063c4cfe8b5bdd3f202591cf56\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7fa59a55753e6c646b3b56a1a7080a5d70767fb964f1857c411fdf4e05ad4c71\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1201887930},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439326 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439599 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439741 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439880 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439893 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.596604 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.901697 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.902284 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.902565 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.023121 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.023639 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.023738 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.025073 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" (UID: "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.026080 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" (UID: "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.050806 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" (UID: "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.125388 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.125425 4804 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.125530 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.271284 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.272432 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.273045 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.273645 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.274172 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429381 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429502 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429528 4804 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429557 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429592 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429707 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.430181 4804 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.430262 4804 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.430283 4804 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.583555 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.610824 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612088 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094" exitCode=0 Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612235 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612277 4804 scope.go:117] "RemoveContainer" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612940 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.613593 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.614280 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.614918 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerDied","Data":"c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1"}
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.614978 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.614981 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.619652 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.620116 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.620367 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.623636 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.624019 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.624547 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.637785 4804 scope.go:117] "RemoveContainer" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.660024 4804 scope.go:117] "RemoveContainer" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.680601 4804 scope.go:117] "RemoveContainer" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.703377 4804 scope.go:117] "RemoveContainer" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.724782 4804 scope.go:117] "RemoveContainer" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.753240 4804 scope.go:117] "RemoveContainer" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81"
Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.754002 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\": container with ID starting with a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81 not found: ID does not exist" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754055 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81"} err="failed to get container status \"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\": rpc error: code = NotFound desc = could not find container \"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\": container with ID starting with a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81 not found: ID does not exist"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754098 4804 scope.go:117] "RemoveContainer" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c"
Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.754711 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\": container with ID starting with 5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c not found: ID does not exist" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754744 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c"} err="failed to get container status \"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\": rpc error: code = NotFound desc = could not find container \"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\": container with ID starting with 5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c not found: ID does not exist"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754772 4804 scope.go:117] "RemoveContainer" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc"
Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.755121 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\": container with ID starting with 2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc not found: ID does not exist" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755167 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc"} err="failed to get container status \"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\": rpc error: code = NotFound desc = could not find container \"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\": container with ID starting with 2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc not found: ID does not exist"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755218 4804 scope.go:117] "RemoveContainer" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8"
Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.755622 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\": container with ID starting with b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8 not found: ID does not exist" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755660 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8"} err="failed to get container status \"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\": rpc error: code = NotFound desc = could not find container \"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\": container with ID starting with b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8 not found: ID does not exist"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755679 4804 scope.go:117] "RemoveContainer" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094"
Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.756010 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\": container with ID starting with 93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094 not found: ID does not exist" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.756038 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094"} err="failed to get container status \"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\": rpc error: code = NotFound desc = could not find container \"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\": container with ID starting with 93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094 not found: ID does not exist"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.756058 4804 scope.go:117] "RemoveContainer" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8"
Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.756374 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\": container with ID starting with 3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8 not found: ID does not exist" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8"
Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.756399 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8"} err="failed to get container status \"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\": rpc error: code = NotFound desc = could not find container \"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\": container with ID starting with 3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8 not found: ID does not exist"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.144831 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" containerID="cri-o://50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55" gracePeriod=15
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.564831 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.566017 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.566237 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.566536 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641012 4804 generic.go:334] "Generic (PLEG): container finished" podID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55" exitCode=0
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641088 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641100 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerDied","Data":"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"}
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641173 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerDied","Data":"71eeeb2236ea109e4995422167d6b6185d64b78a4f394944d8af1d30f1eaa147"}
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641223 4804 scope.go:117] "RemoveContainer" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.642024 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.642547 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.642856 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.666048 4804 scope.go:117] "RemoveContainer" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"
Feb 17 13:29:33 crc kubenswrapper[4804]: E0217 13:29:33.666453 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55\": container with ID starting with 50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55 not found: ID does not exist" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.666484 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"} err="failed to get container status \"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55\": rpc error: code = NotFound desc = could not find container \"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55\": container with ID starting with 50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55 not found: ID does not exist"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.671946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672008 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672029 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672068 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672096 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672120 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672139 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672174 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672225 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672243 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672266 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672295 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") "
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.673636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.675378 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.675579 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.676441 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.676586 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679016 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679487 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln" (OuterVolumeSpecName: "kube-api-access-vfbln") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "kube-api-access-vfbln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679840 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.680025 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.680313 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.681667 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.683150 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.685645 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774845 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774882 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774897 4804 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774911 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774923 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774943 4804 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774956 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774970 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774983 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774994 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775005 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775019 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775031 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775043 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") on node \"crc\" DevicePath \"\""
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.962824 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.963352 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.963983 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:36 crc kubenswrapper[4804]: I0217 13:29:36.577072 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:36 crc kubenswrapper[4804]: I0217 13:29:36.578070 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:36 crc kubenswrapper[4804]: I0217 13:29:36.578736 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:36 crc kubenswrapper[4804]: E0217 13:29:36.822115 4804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950bc4c85f2707 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,LastTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.105707 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.107172 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.107718 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.108025 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.108245 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused"
Feb 17 13:29:38 crc kubenswrapper[4804]: I0217 13:29:38.108268 4804 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.108504 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms"
Feb 17 13:29:38
crc kubenswrapper[4804]: E0217 13:29:38.309894 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.710230 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.511681 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.573868 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.575344 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.577881 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.578733 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.597981 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.598293 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.598849 4804 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.599675 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.692657 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0b0854f3cb262fa049028621e272340489efe82cfa1fc6f2537c58dd46546101"} Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.770553 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:77c09c30acdeaaf95ab463052841d32404d264d7b46bead6207afe51848d25e3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b7b252dee7cfed79b278bcdec32ab88d70e98e83e6c0db9565a87d9e962cfecb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1701350082},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\
"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:14398311b101163ddd1de78c093e161c5d3c9aac51a04e3d3d842fca6317ab0f\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5a091792b99bf4dfaec25f4c8e29da579e2f452d48b924c8323a18accb7f3290\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1234637517},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad77d0ead8abca8b884fad3be18215dbe8b4f8f098053551e4a899298cf5c918\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5338e2ca87e0b47fec93f55559f0ed6b39eef3ed3b7f085a4f0b205ccb86a5d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1213306565},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:28df36269fc553eb1adba5566d6dfc258a1a74063c4cfe8b5bdd3f202591cf56\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7fa59a55753e6c646b3b56a1a7080a5d70767fb964f1857c411fdf4e05ad4c71\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1201887930},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c87
5\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":5
02943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\
\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.771289 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772271 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772548 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772916 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772945 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703160 4804 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="cbe7ae4cd093b968d7c7ee4362aff22a5455f5638fd4398319c0aff8fa79ea7a" exitCode=0 Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703538 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703569 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"cbe7ae4cd093b968d7c7ee4362aff22a5455f5638fd4398319c0aff8fa79ea7a"} Feb 17 13:29:40 crc kubenswrapper[4804]: E0217 13:29:40.704236 4804 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.705073 4804 
status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.705641 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.706085 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722343 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"102e1e5974425fef155110e41c51470f9e6b807e0b092d863f31de5f50f21dc1"} Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722719 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6f50029b23d691f8caa47a0fed4b5fc863c7c5fa284179b0e693e264a1499732"} Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722736 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"332638fe7f4d285f3941d06f5c458e325092eefe1222a3dd152246b25f4b6cf5"} Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722748 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e9d83018121b03cf9cc210107a95e53990a1917656e8639c6978b83c786f2589"} Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.639684 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54174->192.168.126.11:10257: read: connection reset by peer" start-of-body= Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.639746 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54174->192.168.126.11:10257: read: connection reset by peer" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.731678 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7b0cf48a61f4cdf55a6c1b632af4332f979390f1f9d77984648db8043cd05f9f"} Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.732276 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.732324 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.734773 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.734850 4804 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d" exitCode=1 Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.734905 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d"} Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.735509 4804 scope.go:117] "RemoveContainer" containerID="c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d" Feb 17 13:29:43 crc kubenswrapper[4804]: I0217 13:29:43.746190 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 13:29:43 crc kubenswrapper[4804]: I0217 13:29:43.746756 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c"} Feb 17 13:29:43 crc kubenswrapper[4804]: I0217 13:29:43.785474 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:29:44 crc kubenswrapper[4804]: I0217 13:29:44.600237 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:44 crc kubenswrapper[4804]: I0217 13:29:44.600332 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:44 crc kubenswrapper[4804]: I0217 13:29:44.610090 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.741842 4804 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.769132 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.769183 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.769236 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.775477 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.810908 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ab993b1e-b5d6-4960-87de-575a7efa0fa6" Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.773037 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.773413 4804 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222"
Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.777001 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ab993b1e-b5d6-4960-87de-575a7efa0fa6"
Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.846641 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.846956 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.846992 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 17 13:29:49 crc kubenswrapper[4804]: I0217 13:29:49.777840 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222"
Feb 17 13:29:49 crc kubenswrapper[4804]: I0217 13:29:49.777882 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222"
Feb 17 13:29:49 crc kubenswrapper[4804]: I0217 13:29:49.786869 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ab993b1e-b5d6-4960-87de-575a7efa0fa6"
Feb 17 13:29:57 crc kubenswrapper[4804]: I0217 13:29:57.378422 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.490355 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.791494 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.847585 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.847980 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 17 13:29:59 crc kubenswrapper[4804]: I0217 13:29:59.849568 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 13:29:59 crc kubenswrapper[4804]: I0217 13:29:59.945284 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.108008 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.141701 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.174042 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.197077 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.342025 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.434701 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.512278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.522924 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.642743 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.702516 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.907736 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.141110 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.179796 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.551801 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.597412 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.697888 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.754983 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.772441 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.772807 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.776892 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.894846 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.959927 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.961614 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.977177 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.010723 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.044109 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.096400 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.110864 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.357316 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.381656 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.399946 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.447085 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.484219 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.487466 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.684116 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.005190 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.009326 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.090376 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.130909 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.379932 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.442498 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.462113 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.483579 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.492300 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.570394 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.679714 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.839126 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.867156 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.875693 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.917696 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.932649 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.950487 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.956655 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.969078 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.025188 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.058427 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.071421 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.154907 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.418600 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.423451 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.428612 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.530844 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.634019 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.664173 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.678658 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.686270 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.697304 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.833952 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.927910 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.065023 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.072943 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.094598 4804 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.099057 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.168171 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.169760 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.192811 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.226716 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.258693 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.337202 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.548707 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.736271 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.891848 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.898107 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.065551 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.233271 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.257361 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.291986 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.314229 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.337618 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.351276 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.359033 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.451064 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.452974 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.457467 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.474151 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.519707 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.620523 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.650947 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.780272 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.824890 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.855760 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.040412 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.044490 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.059620 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.102838 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.139829 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.211668 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.370111 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.432642 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.454959 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.493943 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.588768 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.602888 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.606284 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.636847 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.653302 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.821182 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.828266 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.944790 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.994521 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.028491 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.029852 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.041075 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.057883 4804 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.058307 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.058285014 podStartE2EDuration="41.058285014s" podCreationTimestamp="2026-02-17 13:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:29:47.784780741 +0000 UTC m=+261.896200098" watchObservedRunningTime="2026-02-17 13:30:08.058285014 +0000 UTC m=+282.169704391"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.066948 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.067050 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.075439 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.095566 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.095550119 podStartE2EDuration="21.095550119s" podCreationTimestamp="2026-02-17 13:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:30:08.09045004 +0000 UTC m=+282.201869387" watchObservedRunningTime="2026-02-17 13:30:08.095550119 +0000 UTC m=+282.206969466"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.185111 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.317681 4804 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.352513 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.369087 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.432696 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.479764 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.522617 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.526443 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.529473 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551329 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"]
Feb 17 13:30:08 crc kubenswrapper[4804]: E0217 13:30:08.551572 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551585 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift"
Feb 17 13:30:08 crc kubenswrapper[4804]: E0217 13:30:08.551609 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerName="installer"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551617 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerName="installer"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551729 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerName="installer"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551743 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.552296 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.554562 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.554808 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.555128 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.555456 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557257 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557436 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557719 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557924 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.558796 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.560350 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.560482 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.561598 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.567062 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.567721 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.572128 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.585395 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" path="/var/lib/kubelet/pods/81f879fe-7bd1-42d0-b026-80f901641a0b/volumes"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.610652 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.614247 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.649344 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.691485 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.708617 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.713992 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-login\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714229 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-session\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714267 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714307 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-policies\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714367 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714401 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxn5\" (UniqueName: \"kubernetes.io/projected/8dbf5bc9-5a1a-4946-823d-1da911581f59-kube-api-access-mrxn5\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714470 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714636 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714792 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-error\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714822 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714847 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714882 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\"
(UniqueName: \"kubernetes.io/host-path/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-dir\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714909 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.775188 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.782470 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.785701 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816010 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816077 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxn5\" (UniqueName: \"kubernetes.io/projected/8dbf5bc9-5a1a-4946-823d-1da911581f59-kube-api-access-mrxn5\") pod 
\"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816135 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816177 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816259 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-error\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816293 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 
13:30:08.816327 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816375 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-dir\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816411 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816446 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-login\") 
pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816553 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-session\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-policies\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.817390 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-dir\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.817873 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-policies\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.818155 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.818545 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.820552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.825807 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " 
pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.825840 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-error\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.826041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-login\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.826631 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-session\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.827239 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.828793 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.830781 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.837118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.839361 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.847158 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.847264 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" 
output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.847323 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.848073 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxn5\" (UniqueName: \"kubernetes.io/projected/8dbf5bc9-5a1a-4946-823d-1da911581f59-kube-api-access-mrxn5\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.849370 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.850290 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c" gracePeriod=30 Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.873666 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.874839 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 
13:30:08.890290 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.890847 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.956689 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.973471 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.995906 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.008504 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.055570 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.120769 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.121765 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.125871 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.219098 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.226138 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.336452 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.354952 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.402013 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.431039 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.571323 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.648278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.771678 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.825786 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.874978 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.878529 4804 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.905158 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.917299 4804 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.950834 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.027763 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.121835 4804 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.122051 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf" gracePeriod=5 Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.273168 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.282786 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.367282 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.531645 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.558307 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.593196 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.604629 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.722192 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.737260 4804 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.765553 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.867960 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.872938 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.877415 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.105947 4804 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.184942 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.393216 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.600048 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.679319 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.829877 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.860382 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.872125 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"] Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.914682 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.936126 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.014318 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 13:30:12 
crc kubenswrapper[4804]: I0217 13:30:12.066579 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.223057 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.322557 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"]
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.351128 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.396905 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.594158 4804 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.601258 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.636255 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.707120 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.792147 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.850553 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.914140 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" event={"ID":"8dbf5bc9-5a1a-4946-823d-1da911581f59","Type":"ContainerStarted","Data":"6ed01f5efaf7d487a86c0e691215c53eaa541deaf5982f335c0080bcbfa88b4f"}
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.914190 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" event={"ID":"8dbf5bc9-5a1a-4946-823d-1da911581f59","Type":"ContainerStarted","Data":"f4723ca020f66f609ab934eb8c1e53d0c21b380d3f685f97a1d7387e1fbed0ba"}
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.914429 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.915329 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.919873 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"
Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.933979 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" podStartSLOduration=64.933964372 podStartE2EDuration="1m4.933964372s" podCreationTimestamp="2026-02-17 13:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:30:12.931755478 +0000 UTC m=+287.043174835" watchObservedRunningTime="2026-02-17 13:30:12.933964372 +0000 UTC m=+287.045383709"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.025888 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.063110 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.190707 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.248002 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.489332 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.507252 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.534812 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.551081 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.567510 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.654608 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.689734 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.844513 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.918843 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.977650 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.000366 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.160436 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.224902 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.231126 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.247662 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.380534 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.500681 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.543941 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.943980 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.266032 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.336960 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.360346 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.448706 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.802196 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.802916 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904657 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904796 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904852 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904971 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905103 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905337 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905394 4804 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905351 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905071 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.913491 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.931961 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.932043 4804 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf" exitCode=137
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.932089 4804 scope.go:117] "RemoveContainer" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.932220 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.959168 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.971035 4804 scope.go:117] "RemoveContainer" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf"
Feb 17 13:30:15 crc kubenswrapper[4804]: E0217 13:30:15.971609 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf\": container with ID starting with 2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf not found: ID does not exist" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf"
Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.971755 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf"} err="failed to get container status \"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf\": rpc error: code = NotFound desc = could not find container \"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf\": container with ID starting with 2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf not found: ID does not exist"
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.005998 4804 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.006032 4804 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.006041 4804 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.006050 4804 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.218851 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.591926 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.592187 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.609424 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.609485 4804 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0b339e42-270d-4384-9d64-67edf62c1ad5"
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.610692 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.610741 4804 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0b339e42-270d-4384-9d64-67edf62c1ad5"
Feb 17 13:30:26 crc kubenswrapper[4804]: I0217 13:30:26.373916 4804 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.167262 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.170947 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.171007 4804 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c" exitCode=137
Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.171053 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c"}
Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.171093 4804 scope.go:117] "RemoveContainer" containerID="c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d"
Feb 17 13:30:40 crc kubenswrapper[4804]: I0217 13:30:40.178666 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 17 13:30:40 crc kubenswrapper[4804]: I0217 13:30:40.180286 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51d848e7fcac5ba8a752f5e5974f1297fda11e24720dd6b2c062443ccf88803d"}
Feb 17 13:30:43 crc kubenswrapper[4804]: I0217 13:30:43.786526 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:30:48 crc kubenswrapper[4804]: I0217 13:30:48.846707 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:30:48 crc kubenswrapper[4804]: I0217 13:30:48.850309 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:30:49 crc kubenswrapper[4804]: I0217 13:30:49.233670 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.697652 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"]
Feb 17 13:31:00 crc kubenswrapper[4804]: E0217 13:31:00.698410 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.698426 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.698544 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.698865 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.702228 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.703621 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.708759 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.708791 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.708852 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.729584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"]
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.764147 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"]
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.764356 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" containerID="cri-o://1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1" gracePeriod=30
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.787968 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"]
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.788180 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" containerID="cri-o://cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6" gracePeriod=30
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.809717 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.809761 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.809810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.810689 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.818347 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.849666 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.014795 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.152979 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219715 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219794 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219836 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219856 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.220975 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca" (OuterVolumeSpecName: "client-ca") pod "b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.221087 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config" (OuterVolumeSpecName: "config") pod "b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.227358 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.233129 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh" (OuterVolumeSpecName: "kube-api-access-7bxfh") pod "b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "kube-api-access-7bxfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.268729 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302161 4804 generic.go:334] "Generic (PLEG): container finished" podID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6" exitCode=0
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302235 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerDied","Data":"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"}
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302251 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302311 4804 scope.go:117] "RemoveContainer" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302295 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerDied","Data":"2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5"}
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305159 4804 generic.go:334] "Generic (PLEG): container finished" podID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1" exitCode=0
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305302 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerDied","Data":"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"}
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305331 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerDied","Data":"558d5dd2eecf846742fd5b4dd243c32953c0fb248ec2faa9cde568927170e4d7"}
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305585 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.322911 4804 scope.go:117] "RemoveContainer" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"
Feb 17 13:31:01 crc kubenswrapper[4804]: E0217 13:31:01.323335 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6\": container with ID starting with cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6 not found: ID does not exist" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.323362 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"} err="failed to get container status \"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6\": rpc error: code = NotFound desc = could not find container \"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6\": container with ID starting with cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6 not found: ID does not exist"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.323383 4804 scope.go:117] "RemoveContainer" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324182 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324239 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324256 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324265 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.340678 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"]
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.345070 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"]
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.345128 4804 scope.go:117] "RemoveContainer" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"
Feb 17 13:31:01 crc kubenswrapper[4804]: E0217 13:31:01.345576 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1\": container with ID starting with 1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1 not found: ID does not exist" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.345611 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"} err="failed to get container status \"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1\": rpc error: code = NotFound desc = could not find container \"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1\": container with ID starting with 1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1 not found: ID does not exist"
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426273 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426360 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426393 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426430 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426472 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") "
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.427067 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.427078 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca" (OuterVolumeSpecName: "client-ca") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.427176 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config" (OuterVolumeSpecName: "config") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.429325 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.431804 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq" (OuterVolumeSpecName: "kube-api-access-7cpwq") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "kube-api-access-7cpwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527939 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527976 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527988 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527996 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") on node \"crc\" 
DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.528006 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.596836 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"] Feb 17 13:31:01 crc kubenswrapper[4804]: W0217 13:31:01.599303 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f0ac4b_5b59_4ff9_92ba_54668fffef27.slice/crio-c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566 WatchSource:0}: Error finding container c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566: Status 404 returned error can't find the container with id c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566 Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.629337 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.635386 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.315050 4804 generic.go:334] "Generic (PLEG): container finished" podID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerID="c63647c4f782e7514611e89775cb3101cab0f160b6675c0b2e9972791cd22306" exitCode=0 Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.315161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" event={"ID":"f9f0ac4b-5b59-4ff9-92ba-54668fffef27","Type":"ContainerDied","Data":"c63647c4f782e7514611e89775cb3101cab0f160b6675c0b2e9972791cd22306"} Feb 17 13:31:02 
crc kubenswrapper[4804]: I0217 13:31:02.315400 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" event={"ID":"f9f0ac4b-5b59-4ff9-92ba-54668fffef27","Type":"ContainerStarted","Data":"c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566"} Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.550758 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"] Feb 17 13:31:02 crc kubenswrapper[4804]: E0217 13:31:02.551189 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: E0217 13:31:02.551303 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551325 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551583 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551630 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.552463 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.554455 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.554656 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.554746 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.555087 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.555534 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.555834 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.556990 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6df4db785d-ddhq7"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.558024 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.561133 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.562479 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.562643 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.563463 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568053 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6df4db785d-ddhq7"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568180 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568322 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568497 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568746 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.599054 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" 
path="/var/lib/kubelet/pods/9631847b-1aa3-4bbd-95d4-cee45d896b11/volumes" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.599672 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" path="/var/lib/kubelet/pods/b710ce8a-f177-4c60-b8d5-bbf18bf38737/volumes" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.642877 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qc9n\" (UniqueName: \"kubernetes.io/projected/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-kube-api-access-4qc9n\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.642942 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-config\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.642968 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-serving-cert\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643007 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-proxy-ca-bundles\") pod 
\"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643030 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-client-ca\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643113 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-client-ca\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643233 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlns\" (UniqueName: \"kubernetes.io/projected/e26901d5-e751-441b-9453-27e1f001a3a9-kube-api-access-wmlns\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643314 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-config\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643359 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26901d5-e751-441b-9453-27e1f001a3a9-serving-cert\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.743917 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-client-ca\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.743997 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlns\" (UniqueName: \"kubernetes.io/projected/e26901d5-e751-441b-9453-27e1f001a3a9-kube-api-access-wmlns\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744038 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-config\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744065 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26901d5-e751-441b-9453-27e1f001a3a9-serving-cert\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " 
pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744103 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qc9n\" (UniqueName: \"kubernetes.io/projected/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-kube-api-access-4qc9n\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744142 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-config\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744168 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-serving-cert\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744238 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-proxy-ca-bundles\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744271 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-client-ca\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-client-ca\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.745061 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-client-ca\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.745179 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-config\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.745268 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-proxy-ca-bundles\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.746139 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-config\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.752009 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26901d5-e751-441b-9453-27e1f001a3a9-serving-cert\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.754722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-serving-cert\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.761720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qc9n\" (UniqueName: \"kubernetes.io/projected/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-kube-api-access-4qc9n\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.764564 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlns\" (UniqueName: \"kubernetes.io/projected/e26901d5-e751-441b-9453-27e1f001a3a9-kube-api-access-wmlns\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " 
pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.914740 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.926383 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.115138 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6df4db785d-ddhq7"] Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.218708 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"] Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.323611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" event={"ID":"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180","Type":"ContainerStarted","Data":"5e6b651d76cfcfa9eea105d262e1df3063a9ffad4a7ecea8637f0a180b1ba235"} Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.326432 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" event={"ID":"e26901d5-e751-441b-9453-27e1f001a3a9","Type":"ContainerStarted","Data":"ad729f732fd481db4774293eb194a2fac94384e1fb71d6667ddff8d3803accf2"} Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.326485 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" event={"ID":"e26901d5-e751-441b-9453-27e1f001a3a9","Type":"ContainerStarted","Data":"cfb009c7a5ab1bf7984ceca99d0eef609ed38297bd6e80482b73b23defa4ef9a"} Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.354475 4804 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" podStartSLOduration=3.35445174 podStartE2EDuration="3.35445174s" podCreationTimestamp="2026-02-17 13:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:03.340579632 +0000 UTC m=+337.451998989" watchObservedRunningTime="2026-02-17 13:31:03.35445174 +0000 UTC m=+337.465871067" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.518490 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.559160 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.559277 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.559318 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.560160 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume" (OuterVolumeSpecName: "config-volume") pod "f9f0ac4b-5b59-4ff9-92ba-54668fffef27" (UID: "f9f0ac4b-5b59-4ff9-92ba-54668fffef27"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.568808 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f9f0ac4b-5b59-4ff9-92ba-54668fffef27" (UID: "f9f0ac4b-5b59-4ff9-92ba-54668fffef27"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.572378 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798" (OuterVolumeSpecName: "kube-api-access-tx798") pod "f9f0ac4b-5b59-4ff9-92ba-54668fffef27" (UID: "f9f0ac4b-5b59-4ff9-92ba-54668fffef27"). InnerVolumeSpecName "kube-api-access-tx798". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.660743 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.660780 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.660789 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.333110 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.333106 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" event={"ID":"f9f0ac4b-5b59-4ff9-92ba-54668fffef27","Type":"ContainerDied","Data":"c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566"} Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.333474 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.335365 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" event={"ID":"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180","Type":"ContainerStarted","Data":"6a48ec29f3734b19aebf029005af544ceafafc0ba49ac1f3e29a6dbbe82c4dbd"} 
Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.335581 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7"
Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.335734 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"
Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.343773 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7"
Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.346754 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"
Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.354513 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" podStartSLOduration=4.354498187 podStartE2EDuration="4.354498187s" podCreationTimestamp="2026-02-17 13:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:04.35228031 +0000 UTC m=+338.463699647" watchObservedRunningTime="2026-02-17 13:31:04.354498187 +0000 UTC m=+338.465917524"
Feb 17 13:31:25 crc kubenswrapper[4804]: I0217 13:31:25.835396 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:31:25 crc kubenswrapper[4804]: I0217 13:31:25.836089 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.578034 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzg5s"]
Feb 17 13:31:45 crc kubenswrapper[4804]: E0217 13:31:45.578792 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerName="collect-profiles"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.578806 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerName="collect-profiles"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.578901 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerName="collect-profiles"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.579293 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.599801 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzg5s"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.608895 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdcb\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-kube-api-access-9zdcb\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.608970 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-bound-sa-token\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609013 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609056 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-trusted-ca\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609108 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609185 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609242 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-tls\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609281 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-certificates\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.620107 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.623615 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpw7w" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server" containerID="cri-o://0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77" gracePeriod=30
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.624576 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54w49"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.624764 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54w49" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server" containerID="cri-o://b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" gracePeriod=30
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.642933 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.643240 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" containerID="cri-o://249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" gracePeriod=30
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.648067 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.648312 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fvtl6" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server" containerID="cri-o://9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" gracePeriod=30
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.654310 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.654795 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xf58f" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server" containerID="cri-o://6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" gracePeriod=30
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.668317 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.677408 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26cwx"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.677990 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.689432 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26cwx"]
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710250 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdcb\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-kube-api-access-9zdcb\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710291 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-bound-sa-token\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710311 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710345 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-trusted-ca\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710360 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-tls\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-certificates\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.711762 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-certificates\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.713767 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.714651 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-trusted-ca\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.718348 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.718782 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-tls\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.728680 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-bound-sa-token\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.728765 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdcb\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-kube-api-access-9zdcb\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.812234 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqn79\" (UniqueName: \"kubernetes.io/projected/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-kube-api-access-qqn79\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.812641 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.816627 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.898343 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.917408 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.917496 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqn79\" (UniqueName: \"kubernetes.io/projected/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-kube-api-access-qqn79\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.917516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.919545 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.920967 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.936262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqn79\" (UniqueName: \"kubernetes.io/projected/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-kube-api-access-qqn79\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.992994 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.110734 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.122005 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.129855 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.139070 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.222707 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.222811 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.222887 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.223997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities" (OuterVolumeSpecName: "utilities") pod "6a10f4e7-7906-43aa-98fb-e709a71a55d2" (UID: "6a10f4e7-7906-43aa-98fb-e709a71a55d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.233361 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj" (OuterVolumeSpecName: "kube-api-access-zdxzj") pod "6a10f4e7-7906-43aa-98fb-e709a71a55d2" (UID: "6a10f4e7-7906-43aa-98fb-e709a71a55d2"). InnerVolumeSpecName "kube-api-access-zdxzj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.249321 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a10f4e7-7906-43aa-98fb-e709a71a55d2" (UID: "6a10f4e7-7906-43aa-98fb-e709a71a55d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324057 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"cbda9f29-b199-4a42-8757-f5ecc90f0437\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324312 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324431 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324508 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"cbda9f29-b199-4a42-8757-f5ecc90f0437\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324538 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324562 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"cbda9f29-b199-4a42-8757-f5ecc90f0437\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324600 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324649 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") "
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324953 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324971 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324985 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.326933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2ce6eded-da13-4bb7-a87d-71b87d0e7f06" (UID: "2ce6eded-da13-4bb7-a87d-71b87d0e7f06"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.328134 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities" (OuterVolumeSpecName: "utilities") pod "cbda9f29-b199-4a42-8757-f5ecc90f0437" (UID: "cbda9f29-b199-4a42-8757-f5ecc90f0437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.328254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities" (OuterVolumeSpecName: "utilities") pod "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" (UID: "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.331931 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2ce6eded-da13-4bb7-a87d-71b87d0e7f06" (UID: "2ce6eded-da13-4bb7-a87d-71b87d0e7f06"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.332060 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx" (OuterVolumeSpecName: "kube-api-access-g6cwx") pod "cbda9f29-b199-4a42-8757-f5ecc90f0437" (UID: "cbda9f29-b199-4a42-8757-f5ecc90f0437"). InnerVolumeSpecName "kube-api-access-g6cwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.332243 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw" (OuterVolumeSpecName: "kube-api-access-x4gxw") pod "2ce6eded-da13-4bb7-a87d-71b87d0e7f06" (UID: "2ce6eded-da13-4bb7-a87d-71b87d0e7f06"). InnerVolumeSpecName "kube-api-access-x4gxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.333018 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj" (OuterVolumeSpecName: "kube-api-access-pm9gj") pod "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" (UID: "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5"). InnerVolumeSpecName "kube-api-access-pm9gj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.363726 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzg5s"]
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.405772 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbda9f29-b199-4a42-8757-f5ecc90f0437" (UID: "cbda9f29-b199-4a42-8757-f5ecc90f0437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426284 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426326 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426350 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426360 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426370 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426380 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426388 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426396 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.458914 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26cwx"]
Feb 17 13:31:46 crc kubenswrapper[4804]: W0217 13:31:46.472054 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a56ea9_6641_4d2d_8471_b40e5f2cf7e5.slice/crio-588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d WatchSource:0}: Error finding container 588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d: Status 404 returned error can't find the container with id 588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.476487 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" (UID: "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.488393 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54w49"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.527081 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.580215 4804 generic.go:334] "Generic (PLEG): container finished" podID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" exitCode=0
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.580312 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54w49"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.584925 4804 generic.go:334] "Generic (PLEG): container finished" podID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" exitCode=0
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.585013 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.588403 4804 generic.go:334] "Generic (PLEG): container finished" podID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" exitCode=0
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.588572 4804 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.591383 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596152 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596286 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"14bd0e0c6146aca8722f654770d91415f769ddfe462bd310b48fc23e91722dce"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596378 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" event={"ID":"556a721b-bf87-43d3-9d93-fabcb7f8f1b0","Type":"ContainerStarted","Data":"1e3905e44f97c52e95e55a0e174cf9f2ec9e9413b7bb547239857c4e21e540c5"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596458 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" event={"ID":"556a721b-bf87-43d3-9d93-fabcb7f8f1b0","Type":"ContainerStarted","Data":"dd70416b24d05181ecf15f4993d0ecac08d8523b5d8892f161f79eac2cb31ba9"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596539 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596629 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" 
event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerDied","Data":"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596713 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerDied","Data":"8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596796 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596978 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" event={"ID":"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5","Type":"ContainerStarted","Data":"f063158903793c873968f2a56861ba5637643358caadd6057e031c9e3fa7390d"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" event={"ID":"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5","Type":"ContainerStarted","Data":"588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597163 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" 
event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.594022 4804 generic.go:334] "Generic (PLEG): container finished" podID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" exitCode=0 Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.592500 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-26cwx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body= Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597467 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597219 4804 scope.go:117] "RemoveContainer" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.594122 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597702 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" podUID="78a56ea9-6641-4d2d-8471-b40e5f2cf7e5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600169 4804 generic.go:334] "Generic (PLEG): container finished" podID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77" exitCode=0 Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600228 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"f8fddc3c1f1b98532bbecd6c7da5c2a2368e8ed8a3bd8f6f7983638879bf50a9"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600327 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.616817 4804 scope.go:117] "RemoveContainer" containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.628176 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.628440 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.628623 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.630735 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities" (OuterVolumeSpecName: "utilities") pod "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" (UID: "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.649365 4804 scope.go:117] "RemoveContainer" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.652449 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm" (OuterVolumeSpecName: "kube-api-access-rmjlm") pod "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" (UID: "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df"). InnerVolumeSpecName "kube-api-access-rmjlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.672773 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" podStartSLOduration=1.672751679 podStartE2EDuration="1.672751679s" podCreationTimestamp="2026-02-17 13:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:46.672451351 +0000 UTC m=+380.783870688" watchObservedRunningTime="2026-02-17 13:31:46.672751679 +0000 UTC m=+380.784171016" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.700404 4804 scope.go:117] "RemoveContainer" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.703754 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd\": container with ID starting with b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd not found: ID does not exist" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.703817 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd"} err="failed to get container status \"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd\": rpc error: code = NotFound desc = could not find container \"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd\": container with ID starting with b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.703852 4804 scope.go:117] "RemoveContainer" containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.706587 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef\": container with ID starting with b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef not found: ID does not exist" containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.706623 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef"} err="failed to get container status \"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef\": rpc error: code = NotFound desc = could not find container \"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef\": container with ID starting with b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.706830 4804 scope.go:117] "RemoveContainer" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 
13:31:46.708577 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c\": container with ID starting with 631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c not found: ID does not exist" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.708748 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c"} err="failed to get container status \"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c\": rpc error: code = NotFound desc = could not find container \"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c\": container with ID starting with 631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.708903 4804 scope.go:117] "RemoveContainer" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.713325 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" podStartSLOduration=1.7133047860000001 podStartE2EDuration="1.713304786s" podCreationTimestamp="2026-02-17 13:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:46.705383442 +0000 UTC m=+380.816802809" watchObservedRunningTime="2026-02-17 13:31:46.713304786 +0000 UTC m=+380.824724133" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.721049 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:31:46 crc kubenswrapper[4804]: 
I0217 13:31:46.731231 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.731264 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.735325 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.740123 4804 scope.go:117] "RemoveContainer" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.740806 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d\": container with ID starting with 249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d not found: ID does not exist" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.740833 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d"} err="failed to get container status \"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d\": rpc error: code = NotFound desc = could not find container \"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d\": container with ID starting with 249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.740858 4804 scope.go:117] 
"RemoveContainer" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.746364 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.754193 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" (UID: "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.757809 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.761117 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.766647 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.771854 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.772935 4804 scope.go:117] "RemoveContainer" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.779112 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.791109 4804 scope.go:117] "RemoveContainer" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" Feb 17 13:31:46 crc kubenswrapper[4804]: 
I0217 13:31:46.810491 4804 scope.go:117] "RemoveContainer" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.810859 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b\": container with ID starting with 9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b not found: ID does not exist" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.810912 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b"} err="failed to get container status \"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b\": rpc error: code = NotFound desc = could not find container \"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b\": container with ID starting with 9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.810943 4804 scope.go:117] "RemoveContainer" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.811558 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201\": container with ID starting with 75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201 not found: ID does not exist" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.811584 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201"} err="failed to get container status \"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201\": rpc error: code = NotFound desc = could not find container \"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201\": container with ID starting with 75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.811600 4804 scope.go:117] "RemoveContainer" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.812360 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5\": container with ID starting with eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5 not found: ID does not exist" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.812386 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5"} err="failed to get container status \"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5\": rpc error: code = NotFound desc = could not find container \"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5\": container with ID starting with eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.812399 4804 scope.go:117] "RemoveContainer" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.829849 4804 scope.go:117] "RemoveContainer" 
containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.832841 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.848400 4804 scope.go:117] "RemoveContainer" containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.864473 4804 scope.go:117] "RemoveContainer" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.864877 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240\": container with ID starting with 6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240 not found: ID does not exist" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.864942 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240"} err="failed to get container status \"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240\": rpc error: code = NotFound desc = could not find container \"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240\": container with ID starting with 6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.864987 4804 scope.go:117] "RemoveContainer" containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 
13:31:46.865428 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e\": container with ID starting with de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e not found: ID does not exist" containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865458 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e"} err="failed to get container status \"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e\": rpc error: code = NotFound desc = could not find container \"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e\": container with ID starting with de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e not found: ID does not exist"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865482 4804 scope.go:117] "RemoveContainer" containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76"
Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.865796 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76\": container with ID starting with f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76 not found: ID does not exist" containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865814 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76"} err="failed to get container status \"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76\": rpc error: code = NotFound desc = could not find container \"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76\": container with ID starting with f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76 not found: ID does not exist"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865827 4804 scope.go:117] "RemoveContainer" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.878218 4804 scope.go:117] "RemoveContainer" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.892979 4804 scope.go:117] "RemoveContainer" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.910947 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54w49"]
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.912729 4804 scope.go:117] "RemoveContainer" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"
Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.913282 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77\": container with ID starting with 0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77 not found: ID does not exist" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913333 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"} err="failed to get container status \"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77\": rpc error: code = NotFound desc = could not find container \"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77\": container with ID starting with 0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77 not found: ID does not exist"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913367 4804 scope.go:117] "RemoveContainer" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"
Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.913657 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19\": container with ID starting with 3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19 not found: ID does not exist" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913691 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"} err="failed to get container status \"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19\": rpc error: code = NotFound desc = could not find container \"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19\": container with ID starting with 3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19 not found: ID does not exist"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913715 4804 scope.go:117] "RemoveContainer" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44"
Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.914063 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44\": container with ID starting with 88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44 not found: ID does not exist" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.914090 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44"} err="failed to get container status \"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44\": rpc error: code = NotFound desc = could not find container \"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44\": container with ID starting with 88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44 not found: ID does not exist"
Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.914420 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-54w49"]
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.612501 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.649779 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5fs82"]
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650003 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650018 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650031 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650038 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650047 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650054 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650063 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650070 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650081 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650088 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650099 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650107 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650122 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650130 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650139 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650148 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650159 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650167 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-content"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650179 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650186 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-utilities"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650214 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650224 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650235 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650242 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650254 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650402 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650417 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650429 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650443 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650453 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.651308 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.654912 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.667270 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fs82"]
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.749179 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrkg9\" (UniqueName: \"kubernetes.io/projected/e7d80260-64fd-4975-a620-5c515a765fd3-kube-api-access-wrkg9\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.749306 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-utilities\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.749363 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-catalog-content\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851149 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrkg9\" (UniqueName: \"kubernetes.io/projected/e7d80260-64fd-4975-a620-5c515a765fd3-kube-api-access-wrkg9\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851229 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-utilities\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851263 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-catalog-content\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851747 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-catalog-content\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851833 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-utilities\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.870466 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrkg9\" (UniqueName: \"kubernetes.io/projected/e7d80260-64fd-4975-a620-5c515a765fd3-kube-api-access-wrkg9\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.984488 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.229807 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"]
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.231064 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.233359 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.243436 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"]
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.356430 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.356469 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.356522 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.452062 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fs82"]
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.457848 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.457884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.458061 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.458457 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.458671 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.476428 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.556316 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.579706 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" path="/var/lib/kubelet/pods/2ce6eded-da13-4bb7-a87d-71b87d0e7f06/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.580501 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" path="/var/lib/kubelet/pods/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.581310 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" path="/var/lib/kubelet/pods/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.582712 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" path="/var/lib/kubelet/pods/6a10f4e7-7906-43aa-98fb-e709a71a55d2/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.583606 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" path="/var/lib/kubelet/pods/cbda9f29-b199-4a42-8757-f5ecc90f0437/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.617582 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"4cc6d7c51e418b43a501474af2e9b9b60e06a16040f7d822d8e1b2cea5711db9"}
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.617642 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"aab8a93b42e209f9c0896ccbc83840d676b303447174bf2e5277b7db9ef5ce9c"}
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.771317 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"]
Feb 17 13:31:48 crc kubenswrapper[4804]: W0217 13:31:48.777208 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf90149_055d_48ca_9336_ca6d6545f8a3.slice/crio-b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208 WatchSource:0}: Error finding container b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208: Status 404 returned error can't find the container with id b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.629718 4804 generic.go:334] "Generic (PLEG): container finished" podID="e7d80260-64fd-4975-a620-5c515a765fd3" containerID="4cc6d7c51e418b43a501474af2e9b9b60e06a16040f7d822d8e1b2cea5711db9" exitCode=0
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.629782 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerDied","Data":"4cc6d7c51e418b43a501474af2e9b9b60e06a16040f7d822d8e1b2cea5711db9"}
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.631437 4804 generic.go:334] "Generic (PLEG): container finished" podID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerID="aec9aafaeb0231fd50b93156ef23ec8d4f34ac9ec3ae7c91631e24543663c093" exitCode=0
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.631464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"aec9aafaeb0231fd50b93156ef23ec8d4f34ac9ec3ae7c91631e24543663c093"}
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.631488 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerStarted","Data":"b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208"}
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.027576 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jhxhx"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.029594 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.031849 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.037668 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jhxhx"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.077367 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9p2\" (UniqueName: \"kubernetes.io/projected/5816c991-ba5a-4d3c-9d69-d28846ca92f6-kube-api-access-zt9p2\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.077425 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-catalog-content\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.077478 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-utilities\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-utilities\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179119 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9p2\" (UniqueName: \"kubernetes.io/projected/5816c991-ba5a-4d3c-9d69-d28846ca92f6-kube-api-access-zt9p2\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179148 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-catalog-content\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179587 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-catalog-content\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.181225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-utilities\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.201863 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9p2\" (UniqueName: \"kubernetes.io/projected/5816c991-ba5a-4d3c-9d69-d28846ca92f6-kube-api-access-zt9p2\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.349186 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.628987 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m2bjw"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.633985 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.638267 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.640087 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2bjw"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.652938 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerStarted","Data":"655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615"}
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.657489 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"fa2f3974c7128503ab67f47b8f0f2c135f4217d52547f1e2d0231f564911984b"}
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.684817 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-catalog-content\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.685036 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-utilities\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.685281 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8vn\" (UniqueName: \"kubernetes.io/projected/57d3429b-b2f5-49ea-94b2-b79aa1769367-kube-api-access-5k8vn\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.771155 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jhxhx"]
Feb 17 13:31:50 crc kubenswrapper[4804]: W0217 13:31:50.774353 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5816c991_ba5a_4d3c_9d69_d28846ca92f6.slice/crio-44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c WatchSource:0}: Error finding container 44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c: Status 404 returned error can't find the container with id 44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.786336 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8vn\" (UniqueName: \"kubernetes.io/projected/57d3429b-b2f5-49ea-94b2-b79aa1769367-kube-api-access-5k8vn\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.786622 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-catalog-content\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.786694 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-utilities\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.787253 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-utilities\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.787552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-catalog-content\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.806277 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8vn\" (UniqueName: \"kubernetes.io/projected/57d3429b-b2f5-49ea-94b2-b79aa1769367-kube-api-access-5k8vn\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.966240 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.355425 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2bjw"]
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.664348 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerStarted","Data":"bfdfc2d7fc6905547354dfb774070a83182c87e28316ab2f18ad07677a3e9bbb"}
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.665928 4804 generic.go:334] "Generic (PLEG): container finished" podID="e7d80260-64fd-4975-a620-5c515a765fd3" containerID="fa2f3974c7128503ab67f47b8f0f2c135f4217d52547f1e2d0231f564911984b" exitCode=0
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.665993 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerDied","Data":"fa2f3974c7128503ab67f47b8f0f2c135f4217d52547f1e2d0231f564911984b"}
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.674043 4804 generic.go:334] "Generic (PLEG): container finished" podID="5816c991-ba5a-4d3c-9d69-d28846ca92f6" containerID="e367fd4183d97d52042e7b9188c938e6f12d4820fbce7a04c1773ea2248fb662" exitCode=0
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.674101 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerDied","Data":"e367fd4183d97d52042e7b9188c938e6f12d4820fbce7a04c1773ea2248fb662"}
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.674130 4804 kubelet.go:2453] "SyncLoop (PLEG):
event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerStarted","Data":"44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c"} Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.677085 4804 generic.go:334] "Generic (PLEG): container finished" podID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerID="655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615" exitCode=0 Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.677125 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615"} Feb 17 13:31:52 crc kubenswrapper[4804]: I0217 13:31:52.683182 4804 generic.go:334] "Generic (PLEG): container finished" podID="57d3429b-b2f5-49ea-94b2-b79aa1769367" containerID="f920816953a1e71425cd0949e078b20754c9607ca1084d5d38622e84385b81f6" exitCode=0 Feb 17 13:31:52 crc kubenswrapper[4804]: I0217 13:31:52.683308 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerDied","Data":"f920816953a1e71425cd0949e078b20754c9607ca1084d5d38622e84385b81f6"} Feb 17 13:31:53 crc kubenswrapper[4804]: I0217 13:31:53.691757 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"61a1703a0dbfcc1dcb3006d155fd893895d87f6e58c62c0e9ff3c6f1569d9df3"} Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.698116 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" 
event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerStarted","Data":"bfed1b1fec8bd92d2c322cf6498a26e10ea50d3847ab60bd2d34adae8689a746"} Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.703639 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerStarted","Data":"b9933b0363f7f4e4a5625db8e26a5f4a9a76ce22cf10be62a4bd19f9e6534fbd"} Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.705567 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerStarted","Data":"99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca"} Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.725055 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5fs82" podStartSLOduration=4.480414735 podStartE2EDuration="7.72503295s" podCreationTimestamp="2026-02-17 13:31:47 +0000 UTC" firstStartedPulling="2026-02-17 13:31:49.631593715 +0000 UTC m=+383.743013052" lastFinishedPulling="2026-02-17 13:31:52.87621191 +0000 UTC m=+386.987631267" observedRunningTime="2026-02-17 13:31:53.711526375 +0000 UTC m=+387.822945712" watchObservedRunningTime="2026-02-17 13:31:54.72503295 +0000 UTC m=+388.836452297" Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.742860 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bhcxz" podStartSLOduration=2.173845965 podStartE2EDuration="6.74284674s" podCreationTimestamp="2026-02-17 13:31:48 +0000 UTC" firstStartedPulling="2026-02-17 13:31:49.633064064 +0000 UTC m=+383.744483401" lastFinishedPulling="2026-02-17 13:31:54.202064839 +0000 UTC m=+388.313484176" observedRunningTime="2026-02-17 13:31:54.741345911 +0000 UTC m=+388.852765248" 
watchObservedRunningTime="2026-02-17 13:31:54.74284674 +0000 UTC m=+388.854266077" Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.712084 4804 generic.go:334] "Generic (PLEG): container finished" podID="57d3429b-b2f5-49ea-94b2-b79aa1769367" containerID="bfed1b1fec8bd92d2c322cf6498a26e10ea50d3847ab60bd2d34adae8689a746" exitCode=0 Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.712151 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerDied","Data":"bfed1b1fec8bd92d2c322cf6498a26e10ea50d3847ab60bd2d34adae8689a746"} Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.714380 4804 generic.go:334] "Generic (PLEG): container finished" podID="5816c991-ba5a-4d3c-9d69-d28846ca92f6" containerID="b9933b0363f7f4e4a5625db8e26a5f4a9a76ce22cf10be62a4bd19f9e6534fbd" exitCode=0 Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.714438 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerDied","Data":"b9933b0363f7f4e4a5625db8e26a5f4a9a76ce22cf10be62a4bd19f9e6534fbd"} Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.835456 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.835749 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:31:56 
crc kubenswrapper[4804]: I0217 13:31:56.721473 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerStarted","Data":"ade1fc4444eea0de1f2fff03c43e9b9a9b528f44acfc95be11d7194dc1810c81"} Feb 17 13:31:56 crc kubenswrapper[4804]: I0217 13:31:56.724706 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerStarted","Data":"d8876f80d0cff4e05b2bb9d76059eed2f80bd1c407188ba2adf986e6b194f57e"} Feb 17 13:31:56 crc kubenswrapper[4804]: I0217 13:31:56.749452 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jhxhx" podStartSLOduration=2.190632011 podStartE2EDuration="6.749432373s" podCreationTimestamp="2026-02-17 13:31:50 +0000 UTC" firstStartedPulling="2026-02-17 13:31:51.675304977 +0000 UTC m=+385.786724314" lastFinishedPulling="2026-02-17 13:31:56.234105329 +0000 UTC m=+390.345524676" observedRunningTime="2026-02-17 13:31:56.7478151 +0000 UTC m=+390.859234437" watchObservedRunningTime="2026-02-17 13:31:56.749432373 +0000 UTC m=+390.860851710" Feb 17 13:31:56 crc kubenswrapper[4804]: I0217 13:31:56.773797 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m2bjw" podStartSLOduration=3.086527499 podStartE2EDuration="6.773777011s" podCreationTimestamp="2026-02-17 13:31:50 +0000 UTC" firstStartedPulling="2026-02-17 13:31:52.685777303 +0000 UTC m=+386.797196630" lastFinishedPulling="2026-02-17 13:31:56.373026805 +0000 UTC m=+390.484446142" observedRunningTime="2026-02-17 13:31:56.77102933 +0000 UTC m=+390.882448677" watchObservedRunningTime="2026-02-17 13:31:56.773777011 +0000 UTC m=+390.885196348" Feb 17 13:31:57 crc kubenswrapper[4804]: I0217 13:31:57.985011 4804 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:57 crc kubenswrapper[4804]: I0217 13:31:57.985368 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.067034 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.556544 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.556597 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.781932 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:59 crc kubenswrapper[4804]: I0217 13:31:59.598548 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bhcxz" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server" probeResult="failure" output=< Feb 17 13:31:59 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s Feb 17 13:31:59 crc kubenswrapper[4804]: > Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.349641 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jhxhx" Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.349936 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jhxhx" Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.389585 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-jhxhx" Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.967176 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m2bjw" Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.967274 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m2bjw" Feb 17 13:32:01 crc kubenswrapper[4804]: I0217 13:32:01.018124 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m2bjw" Feb 17 13:32:01 crc kubenswrapper[4804]: I0217 13:32:01.791560 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m2bjw" Feb 17 13:32:05 crc kubenswrapper[4804]: I0217 13:32:05.908925 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:32:05 crc kubenswrapper[4804]: I0217 13:32:05.968559 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:32:08 crc kubenswrapper[4804]: I0217 13:32:08.623048 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 13:32:08 crc kubenswrapper[4804]: I0217 13:32:08.698487 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 13:32:10 crc kubenswrapper[4804]: I0217 13:32:10.397698 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jhxhx" Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.835268 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.838049 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.838341 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.839456 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.839749 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7" gracePeriod=600 Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908245 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7" exitCode=0 Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908307 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7"} Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908839 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d"} Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908869 4804 scope.go:117] "RemoveContainer" containerID="526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.012860 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry" containerID="cri-o://9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" gracePeriod=30 Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.593271 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674713 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674784 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674825 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674856 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674896 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675080 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675129 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675568 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675706 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.680730 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.681113 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.681440 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v" (OuterVolumeSpecName: "kube-api-access-cqx9v") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "kube-api-access-cqx9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.681631 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.691870 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.707847 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777074 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777125 4804 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777140 4804 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777151 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqx9v\" (UniqueName: 
\"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777164 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777177 4804 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777189 4804 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941096 4804 generic.go:334] "Generic (PLEG): container finished" podID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" exitCode=0 Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941133 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerDied","Data":"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9"} Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerDied","Data":"4dd741b3c38a0505bebb7c99e18c919af01e075e7767edd7ca2356d4e858351e"} Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941179 4804 scope.go:117] "RemoveContainer" 
containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941329 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.961411 4804 scope.go:117] "RemoveContainer" containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" Feb 17 13:32:31 crc kubenswrapper[4804]: E0217 13:32:31.961902 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9\": container with ID starting with 9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9 not found: ID does not exist" containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.961934 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9"} err="failed to get container status \"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9\": rpc error: code = NotFound desc = could not find container \"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9\": container with ID starting with 9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9 not found: ID does not exist" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.994337 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:32:32 crc kubenswrapper[4804]: I0217 13:32:32.001007 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:32:32 crc kubenswrapper[4804]: I0217 13:32:32.586386 4804 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" path="/var/lib/kubelet/pods/b09fea83-e0d3-4a40-b186-8432c3fa7be0/volumes"
Feb 17 13:34:55 crc kubenswrapper[4804]: I0217 13:34:55.836100 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:34:55 crc kubenswrapper[4804]: I0217 13:34:55.836778 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:35:25 crc kubenswrapper[4804]: I0217 13:35:25.835554 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:35:25 crc kubenswrapper[4804]: I0217 13:35:25.836159 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:35:26 crc kubenswrapper[4804]: I0217 13:35:26.778776 4804 scope.go:117] "RemoveContainer" containerID="a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f"
Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.835770 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.836503 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.836566 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.837396 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.837489 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d" gracePeriod=600
Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.188776 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d" exitCode=0
Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.188865 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d"}
Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.189113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e"}
Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.189135 4804 scope.go:117] "RemoveContainer" containerID="45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7"
Feb 17 13:38:03 crc kubenswrapper[4804]: I0217 13:38:03.817851 4804 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 13:38:25 crc kubenswrapper[4804]: I0217 13:38:25.835082 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:38:25 crc kubenswrapper[4804]: I0217 13:38:25.835802 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:38:55 crc kubenswrapper[4804]: I0217 13:38:55.834925 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:38:55 crc kubenswrapper[4804]: I0217 13:38:55.835611 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.938068 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"]
Feb 17 13:39:21 crc kubenswrapper[4804]: E0217 13:39:21.938796 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.938808 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.938923 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.939353 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.945652 4804 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j5t89"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.945734 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.945788 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.949327 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7sfkb"]
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.950183 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7sfkb"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.952746 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"]
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.956076 4804 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pzj97"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.959563 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7sfkb"]
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.982636 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c8nh8"]
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.983658 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8"
Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.985686 4804 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l8nlf"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.000324 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c8nh8"]
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.011275 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgbf\" (UniqueName: \"kubernetes.io/projected/be70f757-4537-489d-a86e-a1b49fc9af75-kube-api-access-7wgbf\") pod \"cert-manager-webhook-687f57d79b-c8nh8\" (UID: \"be70f757-4537-489d-a86e-a1b49fc9af75\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.011392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chqqn\" (UniqueName: \"kubernetes.io/projected/9d2d8008-6348-4f24-8085-d30db8558ab3-kube-api-access-chqqn\") pod \"cert-manager-cainjector-cf98fcc89-kbdz5\" (UID: \"9d2d8008-6348-4f24-8085-d30db8558ab3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.011658 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42s5l\" (UniqueName: \"kubernetes.io/projected/112c357f-f1dc-4a07-bba0-ddf54ab071ff-kube-api-access-42s5l\") pod \"cert-manager-858654f9db-7sfkb\" (UID: \"112c357f-f1dc-4a07-bba0-ddf54ab071ff\") " pod="cert-manager/cert-manager-858654f9db-7sfkb"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.112977 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42s5l\" (UniqueName: \"kubernetes.io/projected/112c357f-f1dc-4a07-bba0-ddf54ab071ff-kube-api-access-42s5l\") pod \"cert-manager-858654f9db-7sfkb\" (UID: \"112c357f-f1dc-4a07-bba0-ddf54ab071ff\") " pod="cert-manager/cert-manager-858654f9db-7sfkb"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.113529 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgbf\" (UniqueName: \"kubernetes.io/projected/be70f757-4537-489d-a86e-a1b49fc9af75-kube-api-access-7wgbf\") pod \"cert-manager-webhook-687f57d79b-c8nh8\" (UID: \"be70f757-4537-489d-a86e-a1b49fc9af75\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.113799 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chqqn\" (UniqueName: \"kubernetes.io/projected/9d2d8008-6348-4f24-8085-d30db8558ab3-kube-api-access-chqqn\") pod \"cert-manager-cainjector-cf98fcc89-kbdz5\" (UID: \"9d2d8008-6348-4f24-8085-d30db8558ab3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.134005 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42s5l\" (UniqueName: \"kubernetes.io/projected/112c357f-f1dc-4a07-bba0-ddf54ab071ff-kube-api-access-42s5l\") pod \"cert-manager-858654f9db-7sfkb\" (UID: \"112c357f-f1dc-4a07-bba0-ddf54ab071ff\") " pod="cert-manager/cert-manager-858654f9db-7sfkb"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.134290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgbf\" (UniqueName: \"kubernetes.io/projected/be70f757-4537-489d-a86e-a1b49fc9af75-kube-api-access-7wgbf\") pod \"cert-manager-webhook-687f57d79b-c8nh8\" (UID: \"be70f757-4537-489d-a86e-a1b49fc9af75\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.135150 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chqqn\" (UniqueName: \"kubernetes.io/projected/9d2d8008-6348-4f24-8085-d30db8558ab3-kube-api-access-chqqn\") pod \"cert-manager-cainjector-cf98fcc89-kbdz5\" (UID: \"9d2d8008-6348-4f24-8085-d30db8558ab3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.266488 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.278288 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7sfkb"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.301334 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8"
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.583428 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c8nh8"]
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.584486 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.717401 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"]
Feb 17 13:39:22 crc kubenswrapper[4804]: W0217 13:39:22.720684 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112c357f_f1dc_4a07_bba0_ddf54ab071ff.slice/crio-682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f WatchSource:0}: Error finding container 682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f: Status 404 returned error can't find the container with id 682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f
Feb 17 13:39:22 crc kubenswrapper[4804]: W0217 13:39:22.720963 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d2d8008_6348_4f24_8085_d30db8558ab3.slice/crio-3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c WatchSource:0}: Error finding container 3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c: Status 404 returned error can't find the container with id 3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c
Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.721390 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7sfkb"]
Feb 17 13:39:23 crc kubenswrapper[4804]: I0217 13:39:23.512681 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" event={"ID":"be70f757-4537-489d-a86e-a1b49fc9af75","Type":"ContainerStarted","Data":"f35a35053b8ad7ea21de12e2f8f4752ea28348753de71473652ddc0a8b819cc0"}
Feb 17 13:39:23 crc kubenswrapper[4804]: I0217 13:39:23.514113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" event={"ID":"9d2d8008-6348-4f24-8085-d30db8558ab3","Type":"ContainerStarted","Data":"3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c"}
Feb 17 13:39:23 crc kubenswrapper[4804]: I0217 13:39:23.515266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7sfkb" event={"ID":"112c357f-f1dc-4a07-bba0-ddf54ab071ff","Type":"ContainerStarted","Data":"682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f"}
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.528050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" event={"ID":"be70f757-4537-489d-a86e-a1b49fc9af75","Type":"ContainerStarted","Data":"b86e05c21bbb75cbb64d4e55e95d54ac4310f8e63ca1474537f83c89b7356cf3"}
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.528408 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8"
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.551298 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" podStartSLOduration=2.18264351 podStartE2EDuration="4.55127749s" podCreationTimestamp="2026-02-17 13:39:21 +0000 UTC" firstStartedPulling="2026-02-17 13:39:22.584309772 +0000 UTC m=+836.695729109" lastFinishedPulling="2026-02-17 13:39:24.952943732 +0000 UTC m=+839.064363089" observedRunningTime="2026-02-17 13:39:25.544650372 +0000 UTC m=+839.656069709" watchObservedRunningTime="2026-02-17 13:39:25.55127749 +0000 UTC m=+839.662696827"
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.834984 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.835072 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.835137 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.836297 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.836405 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e" gracePeriod=600
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.533611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" event={"ID":"9d2d8008-6348-4f24-8085-d30db8558ab3","Type":"ContainerStarted","Data":"33e8ba71037253aef0a408726ffab75c3c60d12eebd0af388def5be96bf29eca"}
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.535697 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7sfkb" event={"ID":"112c357f-f1dc-4a07-bba0-ddf54ab071ff","Type":"ContainerStarted","Data":"5250fbfcb90f23c254a18fdfafde07d341317998eeb98431d0d8b10985a9c93a"}
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539509 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e" exitCode=0
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539566 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e"}
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539596 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69"}
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539612 4804 scope.go:117] "RemoveContainer" containerID="5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d"
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.551405 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" podStartSLOduration=1.967981799 podStartE2EDuration="5.55138104s" podCreationTimestamp="2026-02-17 13:39:21 +0000 UTC" firstStartedPulling="2026-02-17 13:39:22.723344329 +0000 UTC m=+836.834763666" lastFinishedPulling="2026-02-17 13:39:26.30674357 +0000 UTC m=+840.418162907" observedRunningTime="2026-02-17 13:39:26.548115467 +0000 UTC m=+840.659534824" watchObservedRunningTime="2026-02-17 13:39:26.55138104 +0000 UTC m=+840.662800377"
Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.589818 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7sfkb" podStartSLOduration=2.006151632 podStartE2EDuration="5.589800551s" podCreationTimestamp="2026-02-17 13:39:21 +0000 UTC" firstStartedPulling="2026-02-17 13:39:22.723342229 +0000 UTC m=+836.834761566" lastFinishedPulling="2026-02-17 13:39:26.306991128 +0000 UTC m=+840.418410485" observedRunningTime="2026-02-17 13:39:26.587912462 +0000 UTC m=+840.699331799" watchObservedRunningTime="2026-02-17 13:39:26.589800551 +0000 UTC m=+840.701219888"
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.954126 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"]
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955086 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller" containerID="cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" gracePeriod=30
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955137 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb" containerID="cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" gracePeriod=30
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955278 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb" containerID="cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" gracePeriod=30
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955246 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" gracePeriod=30
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955357 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="northd" containerID="cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" gracePeriod=30
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955329 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-node" containerID="cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" gracePeriod=30
Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955296 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging" containerID="cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" gracePeriod=30
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.000449 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" containerID="cri-o://0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" gracePeriod=30
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.303194 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.356302 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.358819 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-acl-logging/0.log"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.359436 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-controller/0.log"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.359919 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.431796 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q64t2"]
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432174 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432247 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432274 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432289 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432310 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432329 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432349 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432364 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432395 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432411 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432433 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432449 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432474 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432488 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432507 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432523 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432545 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kubecfg-setup"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432560 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kubecfg-setup"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432581 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-node"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432597 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-node"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432624 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="northd"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432641 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="northd"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432850 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432879 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432898 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432922 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432941 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-node"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432968 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432988 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="northd"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433013 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433032 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433056 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433079 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.433760 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433800 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.433828 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433847 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.434117 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435135 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435184 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435283 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435307 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435332 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435332 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435352 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435412 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435434 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") "
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435443 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435451 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-run-netns".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435481 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435507 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435526 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435562 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435549 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435590 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435617 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435649 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435677 4804 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435685 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435712 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435744 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435707 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash" (OuterVolumeSpecName: "host-slash") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435707 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435769 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435788 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435729 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435809 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435830 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435788 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435809 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket" (OuterVolumeSpecName: "log-socket") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435945 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log" (OuterVolumeSpecName: "node-log") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436112 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436219 4804 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436234 4804 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436245 4804 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436254 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436267 4804 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436278 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436289 4804 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.436300 4804 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436311 4804 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436325 4804 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436339 4804 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436350 4804 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436360 4804 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436370 4804 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436379 4804 reconciler_common.go:293] "Volume detached for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436389 4804 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436400 4804 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.437369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.441016 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb" (OuterVolumeSpecName: "kube-api-access-75nhb") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "kube-api-access-75nhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.441116 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.453951 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537567 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-systemd-units\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537609 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-etc-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-log-socket\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537668 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-bin\") pod \"ovnkube-node-q64t2\" (UID: 
\"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537741 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-env-overrides\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-netd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537806 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovn-node-metrics-cert\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537831 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-node-log\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537876 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-kubelet\") pod \"ovnkube-node-q64t2\" 
(UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537891 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-systemd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537936 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537957 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-ovn\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537972 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538127 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-netns\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538186 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-script-lib\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-config\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538328 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s245\" (UniqueName: \"kubernetes.io/projected/33fa0baa-0a4a-41c5-976e-5c7f60828272-kube-api-access-5s245\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538538 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-var-lib-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538568 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-slash\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538630 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538645 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538654 4804 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.575166 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.577807 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-acl-logging/0.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.578926 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-controller/0.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579312 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579337 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579363 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579371 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579378 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579386 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579393 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" exitCode=143 Feb 17 13:39:32 crc 
kubenswrapper[4804]: I0217 13:39:32.579401 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" exitCode=143 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579438 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580559 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580604 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580621 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580636 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580650 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" 
event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580661 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580673 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580689 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580695 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580703 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580709 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580715 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc 
kubenswrapper[4804]: I0217 13:39:32.580721 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580728 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580735 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580746 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580756 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580763 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580769 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580775 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580783 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580789 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580795 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580801 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580808 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580815 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580825 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580836 4804 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580855 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580864 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580967 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580978 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580985 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580992 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581000 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581006 4804 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581013 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581020 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581031 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"a80f9c965ade76b1702626786407637ac7c475f156f06af4c297248b43c44248"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581044 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581051 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581057 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581063 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581069 4804 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581075 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581082 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581089 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581095 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581102 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.582821 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/2.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583311 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583343 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="42eec48d-c990-43e6-8348-d9f78997ec3b" containerID="89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde" exitCode=2 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583370 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerDied","Data":"89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583391 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.584542 4804 scope.go:117] "RemoveContainer" containerID="89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.609638 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.627360 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"] Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.631833 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"] Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.632044 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639426 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-var-lib-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: 
I0217 13:39:32.639466 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-slash\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-systemd-units\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-etc-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639544 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-log-socket\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639568 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-bin\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639592 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-env-overrides\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639613 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovn-node-metrics-cert\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639636 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-netd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639643 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-slash\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639655 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-var-lib-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639656 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-node-log\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-node-log\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-systemd-units\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639778 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-log-socket\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639856 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-etc-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639858 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-bin\") pod \"ovnkube-node-q64t2\" (UID: 
\"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639880 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-netd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-kubelet\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640088 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-systemd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640164 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640220 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-systemd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640222 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-kubelet\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640243 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640270 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-ovn\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640315 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-ovn\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640295 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.640273 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640411 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-netns\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640440 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-script-lib\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640517 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-config\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640579 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s245\" (UniqueName: \"kubernetes.io/projected/33fa0baa-0a4a-41c5-976e-5c7f60828272-kube-api-access-5s245\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.640638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-env-overrides\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640527 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-netns\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.641060 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.641243 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-script-lib\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.642989 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-config\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.645741 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovn-node-metrics-cert\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.662657 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.664642 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s245\" (UniqueName: \"kubernetes.io/projected/33fa0baa-0a4a-41c5-976e-5c7f60828272-kube-api-access-5s245\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.681879 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.698918 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.713605 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.728451 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.741529 4804 scope.go:117] 
"RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.754045 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.754783 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.780400 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.780906 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.780938 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.780962 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.781441 4804 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781462 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781474 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.781763 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781876 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container 
\"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781986 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.782360 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782382 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782398 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.782703 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist" 
containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782809 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782889 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.783212 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783249 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783269 4804 scope.go:117] 
"RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.783541 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783635 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783726 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" Feb 17 13:39:32 crc kubenswrapper[4804]: W0217 13:39:32.783850 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33fa0baa_0a4a_41c5_976e_5c7f60828272.slice/crio-1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62 WatchSource:0}: Error finding container 1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62: Status 404 returned error can't find the container with id 1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62 Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.784166 4804 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784255 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784285 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.784577 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784603 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could not find container 
\"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784618 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.784953 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784996 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785014 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785386 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find 
container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785484 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785868 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785892 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.786138 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.786320 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787384 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787408 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787730 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787762 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788225 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 
6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788255 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788543 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788631 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789049 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789069 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789431 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789532 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789934 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789957 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790328 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not 
exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790438 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790776 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790802 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791024 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791044 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791313 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status 
\"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791345 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791756 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791804 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792130 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792156 4804 scope.go:117] "RemoveContainer" 
containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792511 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792533 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792825 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792845 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793187 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could 
not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793277 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793623 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793648 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793847 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793875 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.794158 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794182 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794465 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794486 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794702 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with 
cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794722 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795025 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795045 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795308 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795329 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795573 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795593 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795830 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795850 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796117 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796142 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796387 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796406 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796647 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist"
Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.591314 4804 generic.go:334] "Generic (PLEG): container finished" podID="33fa0baa-0a4a-41c5-976e-5c7f60828272" containerID="ba73a264d7f8563e0e9c4d40dbc9af839c153a50177c18a61830fc5c8a477ad1" exitCode=0
Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.591360 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerDied","Data":"ba73a264d7f8563e0e9c4d40dbc9af839c153a50177c18a61830fc5c8a477ad1"}
Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.591514 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62"}
Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.598448 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/2.log"
Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.599023 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log"
Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.599064 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"ea4800310f7d1b18b67c74de6a007247d05f79d7da23b317f43e391c1b3ebb1f"}
Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.582547 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" path="/var/lib/kubelet/pods/8df4e52a-e578-472b-a6b3-418e9755714f/volumes"
Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608170 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"1120b3852c9eb0d23a2cfc95af8fd714b16650469fba10364cb91d2c99098fa4"}
Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608252 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"7e8cd1a8a21365acd0af75a8f75b47e048e01a2a3d2e2a3931a0be35f83db943"}
Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608281 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"1eb432af7f00483ec2a113397eddcf290bb0a042d015c649a88ce1f5abae770e"}
Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608300 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"8cedcbbd7a320662899aa0d730be4be04bd5efd746aef8638e64f29498523077"}
Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608319 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"9c32de04547fb45496c5f3adb3d2e0cdde1fee35c4b4cfd2943f18c3470e5fb7"}
Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608338 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"94fb2d2ab20bd81235258da1fa469aa591bf35c3d3e5b4c0f9f7d28010aaea5d"}
Feb 17 13:39:36 crc kubenswrapper[4804]: I0217 13:39:36.628559 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"ffbfe501ece81f184e3f2ce45658eed4f8324c2345bec728ac7d73c042a28e18"}
Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648151 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"790c594836962af94d21c4cf97cbf6eb00279642f06f4954e2d0e2b343b1b9e2"}
Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648725 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648799 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648867 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.673431 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.675265 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" podStartSLOduration=7.675251498 podStartE2EDuration="7.675251498s" podCreationTimestamp="2026-02-17 13:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:39:39.674355949 +0000 UTC m=+853.785775296" watchObservedRunningTime="2026-02-17 13:39:39.675251498 +0000 UTC m=+853.786670835"
Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.685249 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:40:02 crc kubenswrapper[4804]: I0217 13:40:02.793344 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.543114 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"]
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.544805 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.551003 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"]
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.552699 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.648258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.648341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.648402 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749408 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749719 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749827 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749916 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.750114 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.768244 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.862117 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.078125 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"]
Feb 17 13:40:11 crc kubenswrapper[4804]: W0217 13:40:11.084662 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c12921_34cb_4c2e_9cb8_585348e46d30.slice/crio-add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9 WatchSource:0}: Error finding container add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9: Status 404 returned error can't find the container with id add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9
Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.847370 4804 generic.go:334] "Generic (PLEG): container finished" podID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerID="68d055d48c0beb73384b28fec6312b4794a9551a75e4790dd6949b3936abdb4b" exitCode=0
Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.847432 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"68d055d48c0beb73384b28fec6312b4794a9551a75e4790dd6949b3936abdb4b"}
Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.847492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerStarted","Data":"add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9"}
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.730483 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m9764"]
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.732326 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.736679 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"]
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.884352 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.884426 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.884471 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.985765 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.985850 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.985891 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.986304 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.986359 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.005589 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.095194 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.295250 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"]
Feb 17 13:40:13 crc kubenswrapper[4804]: W0217 13:40:13.360118 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod360df31e_5543_40bc_a507_76ce8c336d42.slice/crio-16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d WatchSource:0}: Error finding container 16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d: Status 404 returned error can't find the container with id 16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.862002 4804 generic.go:334] "Generic (PLEG): container finished" podID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerID="18d43d9e0721b48fc2e9852ee763f7b840ccfa5191315513c3c6d7bb2544d362" exitCode=0
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.862072 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"18d43d9e0721b48fc2e9852ee763f7b840ccfa5191315513c3c6d7bb2544d362"}
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.864266 4804 generic.go:334] "Generic (PLEG): container finished" podID="360df31e-5543-40bc-a507-76ce8c336d42" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" exitCode=0
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.864320 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea"}
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.864360 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerStarted","Data":"16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d"}
Feb 17 13:40:14 crc kubenswrapper[4804]: I0217 13:40:14.872665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerStarted","Data":"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430"}
Feb 17 13:40:14 crc kubenswrapper[4804]: I0217 13:40:14.876071 4804 generic.go:334] "Generic (PLEG): container finished" podID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerID="06729888763ca9ac544294c550c944e52dc5e36bdb1fa6f8de896feb8f6c3556" exitCode=0
Feb 17 13:40:14 crc kubenswrapper[4804]: I0217 13:40:14.876111 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"06729888763ca9ac544294c550c944e52dc5e36bdb1fa6f8de896feb8f6c3556"}
Feb 17 13:40:15 crc kubenswrapper[4804]: I0217 13:40:15.887308 4804 generic.go:334] "Generic (PLEG): container finished" podID="360df31e-5543-40bc-a507-76ce8c336d42" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" exitCode=0
Feb 17 13:40:15 crc kubenswrapper[4804]: I0217 13:40:15.887447 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430"}
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.241234 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.338433 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod \"17c12921-34cb-4c2e-9cb8-585348e46d30\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") "
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.338538 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"17c12921-34cb-4c2e-9cb8-585348e46d30\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") "
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.338604 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"17c12921-34cb-4c2e-9cb8-585348e46d30\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") "
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.339965 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle" (OuterVolumeSpecName: "bundle") pod "17c12921-34cb-4c2e-9cb8-585348e46d30" (UID: "17c12921-34cb-4c2e-9cb8-585348e46d30"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.344950 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw" (OuterVolumeSpecName: "kube-api-access-pdpsw") pod "17c12921-34cb-4c2e-9cb8-585348e46d30" (UID: "17c12921-34cb-4c2e-9cb8-585348e46d30"). InnerVolumeSpecName "kube-api-access-pdpsw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.353953 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util" (OuterVolumeSpecName: "util") pod "17c12921-34cb-4c2e-9cb8-585348e46d30" (UID: "17c12921-34cb-4c2e-9cb8-585348e46d30"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.440716 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.440759 4804 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.440773 4804 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.898768 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.898781 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9"}
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.899577 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9"
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.904238 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerStarted","Data":"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2"}
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.927350 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m9764" podStartSLOduration=2.449612494 podStartE2EDuration="4.9273252s" podCreationTimestamp="2026-02-17 13:40:12 +0000 UTC" firstStartedPulling="2026-02-17 13:40:13.86593815 +0000 UTC m=+887.977357497" lastFinishedPulling="2026-02-17 13:40:16.343650856 +0000 UTC m=+890.455070203" observedRunningTime="2026-02-17 13:40:16.924635616 +0000 UTC m=+891.036054983" watchObservedRunningTime="2026-02-17 13:40:16.9273252 +0000 UTC m=+891.038744577"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.217881 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"]
Feb 17 13:40:18 crc kubenswrapper[4804]: E0217 13:40:18.218095 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="util"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218106 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="util"
Feb 17 13:40:18 crc kubenswrapper[4804]: E0217 13:40:18.218117 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="extract"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218122 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="extract"
Feb 17 13:40:18 crc kubenswrapper[4804]: E0217 13:40:18.218136 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="pull"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218142 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="pull"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218250 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="extract"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218623 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.220795 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.221130 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.221348 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qjhp9"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.231379 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"]
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.238321 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nskmj\" (UniqueName: \"kubernetes.io/projected/2789dcb9-5619-4986-a692-1eec733c97ff-kube-api-access-nskmj\") pod \"nmstate-operator-694c9596b7-rkf7s\" (UID: \"2789dcb9-5619-4986-a692-1eec733c97ff\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.339397 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nskmj\" (UniqueName: \"kubernetes.io/projected/2789dcb9-5619-4986-a692-1eec733c97ff-kube-api-access-nskmj\") pod \"nmstate-operator-694c9596b7-rkf7s\" (UID: \"2789dcb9-5619-4986-a692-1eec733c97ff\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.356945 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nskmj\" (UniqueName: \"kubernetes.io/projected/2789dcb9-5619-4986-a692-1eec733c97ff-kube-api-access-nskmj\") pod \"nmstate-operator-694c9596b7-rkf7s\" (UID: \"2789dcb9-5619-4986-a692-1eec733c97ff\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.533979 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.801154 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"]
Feb 17 13:40:18 crc kubenswrapper[4804]: W0217 13:40:18.801544 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2789dcb9_5619_4986_a692_1eec733c97ff.slice/crio-d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa WatchSource:0}: Error finding container d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa: Status 404 returned error can't find the container with id d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.915279 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s" event={"ID":"2789dcb9-5619-4986-a692-1eec733c97ff","Type":"ContainerStarted","Data":"d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa"}
Feb 17 13:40:20 crc kubenswrapper[4804]: I0217 13:40:20.919110 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"]
Feb 17 13:40:20 crc kubenswrapper[4804]: I0217 13:40:20.920360 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:20 crc kubenswrapper[4804]: I0217 13:40:20.929682 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"]
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.015985 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.016090 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.016116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.116769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.116817 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.116861 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.117408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.117401 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.140686 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.236034 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.479295 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"]
Feb 17 13:40:21 crc kubenswrapper[4804]: W0217 13:40:21.492311 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode364f68f_7e6e_4f69_8884_19064e2ab186.slice/crio-555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692 WatchSource:0}: Error finding container 555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692: Status 404 returned error can't find the container with id 555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.961207 4804 generic.go:334] "Generic (PLEG): container finished" podID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerID="a58356e342b8d1a0c197b929d754c94eace180ca8295bdab19e683e521269b3f" exitCode=0
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.961292 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"a58356e342b8d1a0c197b929d754c94eace180ca8295bdab19e683e521269b3f"}
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.961336 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerStarted","Data":"555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692"}
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.962502 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
event={"ID":"2789dcb9-5619-4986-a692-1eec733c97ff","Type":"ContainerStarted","Data":"42d70f2666785e518cfdf425959617cb4a8bf3f12a5125e26182af3c2af1ec42"} Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.996551 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s" podStartSLOduration=1.414879656 podStartE2EDuration="3.996534006s" podCreationTimestamp="2026-02-17 13:40:18 +0000 UTC" firstStartedPulling="2026-02-17 13:40:18.80375031 +0000 UTC m=+892.915169647" lastFinishedPulling="2026-02-17 13:40:21.38540466 +0000 UTC m=+895.496823997" observedRunningTime="2026-02-17 13:40:21.995614807 +0000 UTC m=+896.107034154" watchObservedRunningTime="2026-02-17 13:40:21.996534006 +0000 UTC m=+896.107953343" Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.952995 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"] Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.954370 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.956258 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tslwf" Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.969499 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"] Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.970244 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.976561 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.977021 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"] Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.982695 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerStarted","Data":"4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d"} Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.009137 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"] Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.016586 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jxn7r"] Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.017436 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.091346 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"] Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.092007 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.095603 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.095818 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.095827 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.096165 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.096845 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hb7n4" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.100878 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"] Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140042 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prfp\" (UniqueName: \"kubernetes.io/projected/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-kube-api-access-6prfp\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140114 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmh5\" (UniqueName: \"kubernetes.io/projected/81e46a71-360c-4509-ad38-2b2c814a56c2-kube-api-access-xbmh5\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " 
pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140138 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-dbus-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140179 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54gfw\" (UniqueName: \"kubernetes.io/projected/18e3c061-8633-471f-b2ab-e87e3c0b5d44-kube-api-access-54gfw\") pod \"nmstate-metrics-58c85c668d-8gkbz\" (UID: \"18e3c061-8633-471f-b2ab-e87e3c0b5d44\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140223 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-nmstate-lock\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140236 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-ovs-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140253 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: 
\"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241292 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6prfp\" (UniqueName: \"kubernetes.io/projected/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-kube-api-access-6prfp\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241347 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmh5\" (UniqueName: \"kubernetes.io/projected/81e46a71-360c-4509-ad38-2b2c814a56c2-kube-api-access-xbmh5\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241381 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-dbus-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241409 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241441 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhrn6\" (UniqueName: 
\"kubernetes.io/projected/2158c202-5aa4-47aa-87a1-73e4b9043e78-kube-api-access-jhrn6\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241461 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2158c202-5aa4-47aa-87a1-73e4b9043e78-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241483 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54gfw\" (UniqueName: \"kubernetes.io/projected/18e3c061-8633-471f-b2ab-e87e3c0b5d44-kube-api-access-54gfw\") pod \"nmstate-metrics-58c85c668d-8gkbz\" (UID: \"18e3c061-8633-471f-b2ab-e87e3c0b5d44\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241503 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-ovs-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241517 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-nmstate-lock\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241534 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.241653 4804 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.241704 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair podName:36fd4ae3-048e-4e51-b2fa-875a5c84b8e0 nodeName:}" failed. No retries permitted until 2026-02-17 13:40:23.741686338 +0000 UTC m=+897.853105675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair") pod "nmstate-webhook-866bcb46dc-dbfqz" (UID: "36fd4ae3-048e-4e51-b2fa-875a5c84b8e0") : secret "openshift-nmstate-webhook" not found Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.242018 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-dbus-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.242023 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-ovs-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.242040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" 
(UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-nmstate-lock\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.265956 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54gfw\" (UniqueName: \"kubernetes.io/projected/18e3c061-8633-471f-b2ab-e87e3c0b5d44-kube-api-access-54gfw\") pod \"nmstate-metrics-58c85c668d-8gkbz\" (UID: \"18e3c061-8633-471f-b2ab-e87e3c0b5d44\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.274833 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prfp\" (UniqueName: \"kubernetes.io/projected/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-kube-api-access-6prfp\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.279954 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmh5\" (UniqueName: \"kubernetes.io/projected/81e46a71-360c-4509-ad38-2b2c814a56c2-kube-api-access-xbmh5\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.306961 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.315372 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-664d7fb4-tx445"] Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.315997 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.331814 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664d7fb4-tx445"] Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.342405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.342453 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrn6\" (UniqueName: \"kubernetes.io/projected/2158c202-5aa4-47aa-87a1-73e4b9043e78-kube-api-access-jhrn6\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.342473 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2158c202-5aa4-47aa-87a1-73e4b9043e78-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.343357 4804 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.343412 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert podName:2158c202-5aa4-47aa-87a1-73e4b9043e78 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:40:23.843399201 +0000 UTC m=+897.954818538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-bgf7w" (UID: "2158c202-5aa4-47aa-87a1-73e4b9043e78") : secret "plugin-serving-cert" not found Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.343356 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2158c202-5aa4-47aa-87a1-73e4b9043e78-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.363657 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhrn6\" (UniqueName: \"kubernetes.io/projected/2158c202-5aa4-47aa-87a1-73e4b9043e78-kube-api-access-jhrn6\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.376326 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: W0217 13:40:23.406262 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e46a71_360c_4509_ad38_2b2c814a56c2.slice/crio-4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271 WatchSource:0}: Error finding container 4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271: Status 404 returned error can't find the container with id 4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271 Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443800 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-service-ca\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443849 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-oauth-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-trusted-ca-bundle\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443901 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-oauth-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443919 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443942 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-console-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.444047 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzzh\" (UniqueName: \"kubernetes.io/projected/60243734-ea5d-4197-bb21-b278641ce101-kube-api-access-pdzzh\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545646 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzzh\" (UniqueName: \"kubernetes.io/projected/60243734-ea5d-4197-bb21-b278641ce101-kube-api-access-pdzzh\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545688 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-service-ca\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545711 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-oauth-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-trusted-ca-bundle\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545767 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-oauth-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545788 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545816 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-console-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546834 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-oauth-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546836 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-service-ca\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546977 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-console-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546983 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-trusted-ca-bundle\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.549764 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-oauth-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.549763 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.564878 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzzh\" (UniqueName: \"kubernetes.io/projected/60243734-ea5d-4197-bb21-b278641ce101-kube-api-access-pdzzh\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.715807 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.816889 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.821333 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.917022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.918122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.921409 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.003264 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jxn7r" event={"ID":"81e46a71-360c-4509-ad38-2b2c814a56c2","Type":"ContainerStarted","Data":"4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271"} Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.008882 4804 generic.go:334] "Generic (PLEG): container finished" podID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerID="4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d" exitCode=0 Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.008939 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d"} Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.031162 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.180778 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m9764" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" probeResult="failure" output=< Feb 17 13:40:24 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s Feb 17 13:40:24 crc kubenswrapper[4804]: > Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.917019 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"] Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.923882 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36fd4ae3_048e_4e51_b2fa_875a5c84b8e0.slice/crio-45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4 WatchSource:0}: Error finding container 
45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4: Status 404 returned error can't find the container with id 45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4 Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.923971 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"] Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.933089 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"] Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.940511 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2158c202_5aa4_47aa_87a1_73e4b9043e78.slice/crio-0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133 WatchSource:0}: Error finding container 0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133: Status 404 returned error can't find the container with id 0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133 Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.941694 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60243734_ea5d_4197_bb21_b278641ce101.slice/crio-9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc WatchSource:0}: Error finding container 9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc: Status 404 returned error can't find the container with id 9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.947067 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664d7fb4-tx445"] Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.950812 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18e3c061_8633_471f_b2ab_e87e3c0b5d44.slice/crio-914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26 WatchSource:0}: Error finding container 914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26: Status 404 returned error can't find the container with id 914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26 Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.015945 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" event={"ID":"18e3c061-8633-471f-b2ab-e87e3c0b5d44","Type":"ContainerStarted","Data":"914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.017022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" event={"ID":"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0","Type":"ContainerStarted","Data":"45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.018460 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664d7fb4-tx445" event={"ID":"60243734-ea5d-4197-bb21-b278641ce101","Type":"ContainerStarted","Data":"9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.019789 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" event={"ID":"2158c202-5aa4-47aa-87a1-73e4b9043e78","Type":"ContainerStarted","Data":"0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.022211 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" 
event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerStarted","Data":"936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.046775 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m57vw" podStartSLOduration=2.568216374 podStartE2EDuration="5.046755747s" podCreationTimestamp="2026-02-17 13:40:20 +0000 UTC" firstStartedPulling="2026-02-17 13:40:21.96384416 +0000 UTC m=+896.075263507" lastFinishedPulling="2026-02-17 13:40:24.442383543 +0000 UTC m=+898.553802880" observedRunningTime="2026-02-17 13:40:25.041834632 +0000 UTC m=+899.153253969" watchObservedRunningTime="2026-02-17 13:40:25.046755747 +0000 UTC m=+899.158175104" Feb 17 13:40:26 crc kubenswrapper[4804]: I0217 13:40:26.030660 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664d7fb4-tx445" event={"ID":"60243734-ea5d-4197-bb21-b278641ce101","Type":"ContainerStarted","Data":"d029fc37d28f0fc9736310fd30cf0f3429d4e245f42e37fc11827000f84680bc"} Feb 17 13:40:26 crc kubenswrapper[4804]: I0217 13:40:26.050578 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-664d7fb4-tx445" podStartSLOduration=3.050557821 podStartE2EDuration="3.050557821s" podCreationTimestamp="2026-02-17 13:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:40:26.047495295 +0000 UTC m=+900.158914632" watchObservedRunningTime="2026-02-17 13:40:26.050557821 +0000 UTC m=+900.161977158" Feb 17 13:40:26 crc kubenswrapper[4804]: I0217 13:40:26.861050 4804 scope.go:117] "RemoveContainer" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a" Feb 17 13:40:27 crc kubenswrapper[4804]: I0217 13:40:27.038352 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/2.log" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.058691 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jxn7r" event={"ID":"81e46a71-360c-4509-ad38-2b2c814a56c2","Type":"ContainerStarted","Data":"a961e2afe58699decd28bc4b065d6f984dea427bbcb5c3f9dc9ebee7cb470db6"} Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.059441 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.060124 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" event={"ID":"18e3c061-8633-471f-b2ab-e87e3c0b5d44","Type":"ContainerStarted","Data":"a9398a7bb08979780f96eaf493af71ddbd2a07b2910e645679824d3047d6cfc2"} Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.069080 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" event={"ID":"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0","Type":"ContainerStarted","Data":"49e7222b3b541d9076fae97cf42fc4edcaa1361f71fa5f57527a9556124e5884"} Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.069357 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.080435 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jxn7r" podStartSLOduration=2.559791299 podStartE2EDuration="6.080416697s" podCreationTimestamp="2026-02-17 13:40:22 +0000 UTC" firstStartedPulling="2026-02-17 13:40:23.415321129 +0000 UTC m=+897.526740466" lastFinishedPulling="2026-02-17 13:40:26.935946517 +0000 UTC m=+901.047365864" observedRunningTime="2026-02-17 13:40:28.072460857 +0000 UTC m=+902.183880204" 
watchObservedRunningTime="2026-02-17 13:40:28.080416697 +0000 UTC m=+902.191836034" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.091540 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" podStartSLOduration=3.905836657 podStartE2EDuration="6.091515816s" podCreationTimestamp="2026-02-17 13:40:22 +0000 UTC" firstStartedPulling="2026-02-17 13:40:24.925946684 +0000 UTC m=+899.037366031" lastFinishedPulling="2026-02-17 13:40:27.111625853 +0000 UTC m=+901.223045190" observedRunningTime="2026-02-17 13:40:28.088430709 +0000 UTC m=+902.199850056" watchObservedRunningTime="2026-02-17 13:40:28.091515816 +0000 UTC m=+902.202935153" Feb 17 13:40:29 crc kubenswrapper[4804]: I0217 13:40:29.076482 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" event={"ID":"2158c202-5aa4-47aa-87a1-73e4b9043e78","Type":"ContainerStarted","Data":"6ca5b5650b9eeab96ea3c8a13f711527a677e9d1df0164b708cdad34f6ce0e7b"} Feb 17 13:40:29 crc kubenswrapper[4804]: I0217 13:40:29.093361 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" podStartSLOduration=2.210334902 podStartE2EDuration="6.093329957s" podCreationTimestamp="2026-02-17 13:40:23 +0000 UTC" firstStartedPulling="2026-02-17 13:40:24.944977412 +0000 UTC m=+899.056396749" lastFinishedPulling="2026-02-17 13:40:28.827972467 +0000 UTC m=+902.939391804" observedRunningTime="2026-02-17 13:40:29.091926332 +0000 UTC m=+903.203345669" watchObservedRunningTime="2026-02-17 13:40:29.093329957 +0000 UTC m=+903.204749294" Feb 17 13:40:30 crc kubenswrapper[4804]: I0217 13:40:30.084752 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" 
event={"ID":"18e3c061-8633-471f-b2ab-e87e3c0b5d44","Type":"ContainerStarted","Data":"08b91bb4e93da5f46156e9709ce0c854133d52014bfb7c7cd1a5329e6addded0"} Feb 17 13:40:30 crc kubenswrapper[4804]: I0217 13:40:30.111357 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" podStartSLOduration=3.233420036 podStartE2EDuration="8.111338376s" podCreationTimestamp="2026-02-17 13:40:22 +0000 UTC" firstStartedPulling="2026-02-17 13:40:24.960573701 +0000 UTC m=+899.071993038" lastFinishedPulling="2026-02-17 13:40:29.838492041 +0000 UTC m=+903.949911378" observedRunningTime="2026-02-17 13:40:30.107940839 +0000 UTC m=+904.219360216" watchObservedRunningTime="2026-02-17 13:40:30.111338376 +0000 UTC m=+904.222757733" Feb 17 13:40:31 crc kubenswrapper[4804]: I0217 13:40:31.237002 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:31 crc kubenswrapper[4804]: I0217 13:40:31.237501 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:31 crc kubenswrapper[4804]: I0217 13:40:31.297353 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:32 crc kubenswrapper[4804]: I0217 13:40:32.156460 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:32 crc kubenswrapper[4804]: I0217 13:40:32.202890 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"] Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.149493 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.201951 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.418644 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.716005 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.716077 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.720778 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.933183 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:34 crc kubenswrapper[4804]: I0217 13:40:34.119573 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m57vw" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" containerID="cri-o://936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31" gracePeriod=2 Feb 17 13:40:34 crc kubenswrapper[4804]: I0217 13:40:34.130575 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:34 crc kubenswrapper[4804]: I0217 13:40:34.209554 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"] Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.133058 4804 generic.go:334] "Generic (PLEG): container finished" podID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerID="936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31" exitCode=0 Feb 17 13:40:35 crc 
kubenswrapper[4804]: I0217 13:40:35.133149 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31"} Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.134448 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692"} Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.134499 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.134630 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m9764" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" containerID="cri-o://6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" gracePeriod=2 Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.143711 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.269691 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"e364f68f-7e6e-4f69-8884-19064e2ab186\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.269769 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"e364f68f-7e6e-4f69-8884-19064e2ab186\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.269801 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"e364f68f-7e6e-4f69-8884-19064e2ab186\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.270775 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities" (OuterVolumeSpecName: "utilities") pod "e364f68f-7e6e-4f69-8884-19064e2ab186" (UID: "e364f68f-7e6e-4f69-8884-19064e2ab186"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.278130 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln" (OuterVolumeSpecName: "kube-api-access-mvhln") pod "e364f68f-7e6e-4f69-8884-19064e2ab186" (UID: "e364f68f-7e6e-4f69-8884-19064e2ab186"). InnerVolumeSpecName "kube-api-access-mvhln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.296786 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e364f68f-7e6e-4f69-8884-19064e2ab186" (UID: "e364f68f-7e6e-4f69-8884-19064e2ab186"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.372387 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.372427 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.372440 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.485477 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.573955 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"360df31e-5543-40bc-a507-76ce8c336d42\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.574027 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"360df31e-5543-40bc-a507-76ce8c336d42\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.574114 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"360df31e-5543-40bc-a507-76ce8c336d42\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.575406 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities" (OuterVolumeSpecName: "utilities") pod "360df31e-5543-40bc-a507-76ce8c336d42" (UID: "360df31e-5543-40bc-a507-76ce8c336d42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.578450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv" (OuterVolumeSpecName: "kube-api-access-sp5hv") pod "360df31e-5543-40bc-a507-76ce8c336d42" (UID: "360df31e-5543-40bc-a507-76ce8c336d42"). InnerVolumeSpecName "kube-api-access-sp5hv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.676028 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.676266 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.693069 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "360df31e-5543-40bc-a507-76ce8c336d42" (UID: "360df31e-5543-40bc-a507-76ce8c336d42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.777413 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149266 4804 generic.go:334] "Generic (PLEG): container finished" podID="360df31e-5543-40bc-a507-76ce8c336d42" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" exitCode=0 Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149368 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149390 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149401 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2"} Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d"} Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149538 4804 scope.go:117] "RemoveContainer" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.178737 4804 scope.go:117] "RemoveContainer" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.196413 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.207303 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.212321 4804 scope.go:117] "RemoveContainer" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.217400 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.220754 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.230858 
4804 scope.go:117] "RemoveContainer" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" Feb 17 13:40:36 crc kubenswrapper[4804]: E0217 13:40:36.231927 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2\": container with ID starting with 6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2 not found: ID does not exist" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232001 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2"} err="failed to get container status \"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2\": rpc error: code = NotFound desc = could not find container \"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2\": container with ID starting with 6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2 not found: ID does not exist" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232066 4804 scope.go:117] "RemoveContainer" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" Feb 17 13:40:36 crc kubenswrapper[4804]: E0217 13:40:36.232710 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430\": container with ID starting with 8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430 not found: ID does not exist" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232749 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430"} err="failed to get container status \"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430\": rpc error: code = NotFound desc = could not find container \"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430\": container with ID starting with 8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430 not found: ID does not exist" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232786 4804 scope.go:117] "RemoveContainer" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" Feb 17 13:40:36 crc kubenswrapper[4804]: E0217 13:40:36.233376 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea\": container with ID starting with a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea not found: ID does not exist" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.233423 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea"} err="failed to get container status \"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea\": rpc error: code = NotFound desc = could not find container \"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea\": container with ID starting with a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea not found: ID does not exist" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.581549 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="360df31e-5543-40bc-a507-76ce8c336d42" path="/var/lib/kubelet/pods/360df31e-5543-40bc-a507-76ce8c336d42/volumes" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 
13:40:36.582124 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" path="/var/lib/kubelet/pods/e364f68f-7e6e-4f69-8884-19064e2ab186/volumes" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.339564 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340149 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340161 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340175 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340181 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340191 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340217 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340229 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340235 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" Feb 17 13:40:39 crc 
kubenswrapper[4804]: E0217 13:40:39.340244 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340249 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340256 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340398 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340407 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.341261 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.358129 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.427995 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.428147 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.428179 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529117 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529178 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529716 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529955 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.551798 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.675718 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.969117 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:40 crc kubenswrapper[4804]: I0217 13:40:40.180468 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695" exitCode=0 Feb 17 13:40:40 crc kubenswrapper[4804]: I0217 13:40:40.180519 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695"} Feb 17 13:40:40 crc kubenswrapper[4804]: I0217 13:40:40.180544 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerStarted","Data":"7c939a790c69d09c4cd698a95d3c6e66cbf9bcb5e1dee342b73c64ad91892bab"} Feb 17 13:40:41 crc kubenswrapper[4804]: I0217 13:40:41.194859 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerStarted","Data":"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"} Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.203303 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5" exitCode=0 Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.203420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" 
event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"} Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.203907 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerStarted","Data":"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"} Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.223506 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8bzvf" podStartSLOduration=1.829693815 podStartE2EDuration="3.223485862s" podCreationTimestamp="2026-02-17 13:40:39 +0000 UTC" firstStartedPulling="2026-02-17 13:40:40.182245289 +0000 UTC m=+914.293664636" lastFinishedPulling="2026-02-17 13:40:41.576037306 +0000 UTC m=+915.687456683" observedRunningTime="2026-02-17 13:40:42.222665456 +0000 UTC m=+916.334084793" watchObservedRunningTime="2026-02-17 13:40:42.223485862 +0000 UTC m=+916.334905199" Feb 17 13:40:43 crc kubenswrapper[4804]: I0217 13:40:43.928816 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.141096 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d2786"] Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.142662 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.161144 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2786"] Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.200848 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.200926 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.201115 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302216 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302308 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302352 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302831 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.328377 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.459891 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.923114 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2786"] Feb 17 13:40:45 crc kubenswrapper[4804]: I0217 13:40:45.223234 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerID="2841989cfb43995c971c4405cddc2c9830da84b3d169d1f91be7e47313003065" exitCode=0 Feb 17 13:40:45 crc kubenswrapper[4804]: I0217 13:40:45.223288 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"2841989cfb43995c971c4405cddc2c9830da84b3d169d1f91be7e47313003065"} Feb 17 13:40:45 crc kubenswrapper[4804]: I0217 13:40:45.223317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerStarted","Data":"2d5cbab8cf904e1f2afff630660ba9ad4d8260633fdac34c04024ed3278b2e02"} Feb 17 13:40:46 crc kubenswrapper[4804]: I0217 13:40:46.232550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerStarted","Data":"1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28"} Feb 17 13:40:47 crc kubenswrapper[4804]: I0217 13:40:47.240177 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerID="1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28" exitCode=0 Feb 17 13:40:47 crc kubenswrapper[4804]: I0217 13:40:47.240236 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" 
event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28"} Feb 17 13:40:48 crc kubenswrapper[4804]: I0217 13:40:48.248001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerStarted","Data":"56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457"} Feb 17 13:40:48 crc kubenswrapper[4804]: I0217 13:40:48.271328 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d2786" podStartSLOduration=1.850627993 podStartE2EDuration="4.271307979s" podCreationTimestamp="2026-02-17 13:40:44 +0000 UTC" firstStartedPulling="2026-02-17 13:40:45.225774967 +0000 UTC m=+919.337194314" lastFinishedPulling="2026-02-17 13:40:47.646454963 +0000 UTC m=+921.757874300" observedRunningTime="2026-02-17 13:40:48.266697005 +0000 UTC m=+922.378116382" watchObservedRunningTime="2026-02-17 13:40:48.271307979 +0000 UTC m=+922.382727336" Feb 17 13:40:49 crc kubenswrapper[4804]: I0217 13:40:49.676578 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:49 crc kubenswrapper[4804]: I0217 13:40:49.676860 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:49 crc kubenswrapper[4804]: I0217 13:40:49.716431 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:50 crc kubenswrapper[4804]: I0217 13:40:50.311401 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:50 crc kubenswrapper[4804]: I0217 13:40:50.731233 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:52 crc kubenswrapper[4804]: I0217 13:40:52.288641 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8bzvf" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server" containerID="cri-o://488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe" gracePeriod=2 Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.247982 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294749 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe" exitCode=0 Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294797 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"} Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294820 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"7c939a790c69d09c4cd698a95d3c6e66cbf9bcb5e1dee342b73c64ad91892bab"} Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294854 4804 scope.go:117] "RemoveContainer" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.311648 4804 scope.go:117] "RemoveContainer" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.325992 4804 scope.go:117] "RemoveContainer" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.340064 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.340150 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.341296 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\" (UID: 
\"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.341400 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities" (OuterVolumeSpecName: "utilities") pod "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" (UID: "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.341809 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.345647 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t" (OuterVolumeSpecName: "kube-api-access-9gc4t") pod "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" (UID: "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c"). InnerVolumeSpecName "kube-api-access-9gc4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.345661 4804 scope.go:117] "RemoveContainer" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe" Feb 17 13:40:53 crc kubenswrapper[4804]: E0217 13:40:53.346157 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe\": container with ID starting with 488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe not found: ID does not exist" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346213 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"} err="failed to get container status \"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe\": rpc error: code = NotFound desc = could not find container \"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe\": container with ID starting with 488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe not found: ID does not exist" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346240 4804 scope.go:117] "RemoveContainer" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5" Feb 17 13:40:53 crc kubenswrapper[4804]: E0217 13:40:53.346671 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5\": container with ID starting with 214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5 not found: ID does not exist" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346711 
4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"} err="failed to get container status \"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5\": rpc error: code = NotFound desc = could not find container \"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5\": container with ID starting with 214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5 not found: ID does not exist" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346737 4804 scope.go:117] "RemoveContainer" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695" Feb 17 13:40:53 crc kubenswrapper[4804]: E0217 13:40:53.347374 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695\": container with ID starting with 85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695 not found: ID does not exist" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.347469 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695"} err="failed to get container status \"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695\": rpc error: code = NotFound desc = could not find container \"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695\": container with ID starting with 85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695 not found: ID does not exist" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.394324 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" (UID: "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.442766 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.442795 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.626225 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.650054 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.460382 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.460667 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.507463 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.582306 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" path="/var/lib/kubelet/pods/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c/volumes" Feb 17 13:40:55 crc kubenswrapper[4804]: I0217 
13:40:55.354076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:56 crc kubenswrapper[4804]: I0217 13:40:56.130997 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2786"] Feb 17 13:40:57 crc kubenswrapper[4804]: I0217 13:40:57.319618 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d2786" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server" containerID="cri-o://56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457" gracePeriod=2 Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.001516 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"] Feb 17 13:40:58 crc kubenswrapper[4804]: E0217 13:40:58.001923 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-utilities" Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.001952 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-utilities" Feb 17 13:40:58 crc kubenswrapper[4804]: E0217 13:40:58.001981 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server" Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.001993 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server" Feb 17 13:40:58 crc kubenswrapper[4804]: E0217 13:40:58.002019 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-content" Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.002031 4804 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-content"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.002266 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.004091 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.006138 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.017997 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"]
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.120251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.120595 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.120645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.221719 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.221806 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.221831 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.222357 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.222617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.246766 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.327129 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457"}
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.327059 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerID="56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457" exitCode=0
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.336119 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.523877 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"]
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.839382 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.933243 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") "
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.933330 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") "
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.933379 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") "
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.934671 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities" (OuterVolumeSpecName: "utilities") pod "3d13c70f-ee22-4434-ae7a-92e62c3caa26" (UID: "3d13c70f-ee22-4434-ae7a-92e62c3caa26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.940248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c" (OuterVolumeSpecName: "kube-api-access-pzz5c") pod "3d13c70f-ee22-4434-ae7a-92e62c3caa26" (UID: "3d13c70f-ee22-4434-ae7a-92e62c3caa26"). InnerVolumeSpecName "kube-api-access-pzz5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.009807 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d13c70f-ee22-4434-ae7a-92e62c3caa26" (UID: "3d13c70f-ee22-4434-ae7a-92e62c3caa26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.034821 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.034857 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.034871 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.266033 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-tz5vz" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" containerID="cri-o://f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4" gracePeriod=15
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.335726 4804 generic.go:334] "Generic (PLEG): container finished" podID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerID="a312566d330b5b43cd5b5e5db5ba88efab0b15fe3b0a36d64c0962e38572777f" exitCode=0
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.335812 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"a312566d330b5b43cd5b5e5db5ba88efab0b15fe3b0a36d64c0962e38572777f"}
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.335842 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerStarted","Data":"ed4b034533ab832fa3d6d9792b943b6994648cbfe779c359e76481aaf00925de"}
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.339890 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"2d5cbab8cf904e1f2afff630660ba9ad4d8260633fdac34c04024ed3278b2e02"}
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.340246 4804 scope.go:117] "RemoveContainer" containerID="56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.340400 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.411116 4804 scope.go:117] "RemoveContainer" containerID="1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.412312 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2786"]
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.419372 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d2786"]
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.431450 4804 scope.go:117] "RemoveContainer" containerID="2841989cfb43995c971c4405cddc2c9830da84b3d169d1f91be7e47313003065"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.600266 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tz5vz_9eb6b4b9-9e2e-4f39-9df0-068cfea71701/console/0.log"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.600581 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.744865 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.744926 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.744946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745001 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745028 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745094 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745117 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745807 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745813 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745881 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca" (OuterVolumeSpecName: "service-ca") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.746142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config" (OuterVolumeSpecName: "console-config") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.756123 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.756306 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q" (OuterVolumeSpecName: "kube-api-access-xq22q") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "kube-api-access-xq22q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.756465 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847283 4804 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847322 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847332 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847342 4804 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847353 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847361 4804 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847369 4804 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.347979 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tz5vz_9eb6b4b9-9e2e-4f39-9df0-068cfea71701/console/0.log"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348323 4804 generic.go:334] "Generic (PLEG): container finished" podID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4" exitCode=2
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348398 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerDied","Data":"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"}
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348438 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerDied","Data":"981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba"}
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348402 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348464 4804 scope.go:117] "RemoveContainer" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.386427 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"]
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.387249 4804 scope.go:117] "RemoveContainer" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"
Feb 17 13:41:00 crc kubenswrapper[4804]: E0217 13:41:00.388637 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4\": container with ID starting with f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4 not found: ID does not exist" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.388687 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"} err="failed to get container status \"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4\": rpc error: code = NotFound desc = could not find container \"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4\": container with ID starting with f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4 not found: ID does not exist"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.394966 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"]
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.579269 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" path="/var/lib/kubelet/pods/3d13c70f-ee22-4434-ae7a-92e62c3caa26/volumes"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.580056 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" path="/var/lib/kubelet/pods/9eb6b4b9-9e2e-4f39-9df0-068cfea71701/volumes"
Feb 17 13:41:01 crc kubenswrapper[4804]: I0217 13:41:01.359990 4804 generic.go:334] "Generic (PLEG): container finished" podID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerID="079f52519ec7defe8849f5cf7a1dd012d358385362721d77c032579b21d6da77" exitCode=0
Feb 17 13:41:01 crc kubenswrapper[4804]: I0217 13:41:01.360096 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"079f52519ec7defe8849f5cf7a1dd012d358385362721d77c032579b21d6da77"}
Feb 17 13:41:02 crc kubenswrapper[4804]: I0217 13:41:02.379346 4804 generic.go:334] "Generic (PLEG): container finished" podID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerID="36d3022c9303016a4bff30c0b94a6e380411ddc9446661e703721a0c232e7d03" exitCode=0
Feb 17 13:41:02 crc kubenswrapper[4804]: I0217 13:41:02.379415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"36d3022c9303016a4bff30c0b94a6e380411ddc9446661e703721a0c232e7d03"}
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.686520 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.808175 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") "
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.808249 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") "
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.808278 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") "
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.809370 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle" (OuterVolumeSpecName: "bundle") pod "7e8c98d2-433f-46f9-a2f3-3a368c1b2608" (UID: "7e8c98d2-433f-46f9-a2f3-3a368c1b2608"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.813687 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll" (OuterVolumeSpecName: "kube-api-access-x2dll") pod "7e8c98d2-433f-46f9-a2f3-3a368c1b2608" (UID: "7e8c98d2-433f-46f9-a2f3-3a368c1b2608"). InnerVolumeSpecName "kube-api-access-x2dll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.839701 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util" (OuterVolumeSpecName: "util") pod "7e8c98d2-433f-46f9-a2f3-3a368c1b2608" (UID: "7e8c98d2-433f-46f9-a2f3-3a368c1b2608"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.909993 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") on node \"crc\" DevicePath \"\""
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.910025 4804 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") on node \"crc\" DevicePath \"\""
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.910041 4804 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:41:04 crc kubenswrapper[4804]: I0217 13:41:04.396494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"ed4b034533ab832fa3d6d9792b943b6994648cbfe779c359e76481aaf00925de"}
Feb 17 13:41:04 crc kubenswrapper[4804]: I0217 13:41:04.396541 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:41:04 crc kubenswrapper[4804]: I0217 13:41:04.396555 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed4b034533ab832fa3d6d9792b943b6994648cbfe779c359e76481aaf00925de"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.841316 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"]
Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842086 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842099 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console"
Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842109 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-utilities"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842116 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-utilities"
Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842125 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="pull"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842131 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="pull"
Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842139 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842145 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server"
Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842159 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="extract"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842166 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="extract"
Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842178 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-content"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842185 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-content"
Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842226 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="util"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842234 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="util"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842351 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842363 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="extract"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842381 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842855 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.844743 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.844743 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.845608 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.845870 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.846084 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-pwpsc"
Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.862311 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"]
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.009239 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-apiservice-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.009311 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-webhook-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.009329 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q2zj\" (UniqueName: \"kubernetes.io/projected/c17333d4-cfc6-4129-af9e-a8f2db54988b-kube-api-access-9q2zj\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.110988 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-apiservice-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.111073 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-webhook-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.111095 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q2zj\" (UniqueName: \"kubernetes.io/projected/c17333d4-cfc6-4129-af9e-a8f2db54988b-kube-api-access-9q2zj\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.119696 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-webhook-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.129730 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-apiservice-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.142025 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q2zj\" (UniqueName: \"kubernetes.io/projected/c17333d4-cfc6-4129-af9e-a8f2db54988b-kube-api-access-9q2zj\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.144663 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"]
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.145339 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.152257 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.152566 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.153070 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-h58kr"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.158246 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.167818 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"]
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.313411 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-webhook-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.313751 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-apiservice-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"
Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.313808 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbp76\" (UniqueName: \"kubernetes.io/projected/82716046-7f15-43d7-b9de-8fdb68a44c0b-kube-api-access-bbp76\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.414706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-webhook-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.414764 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-apiservice-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.414793 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbp76\" (UniqueName: \"kubernetes.io/projected/82716046-7f15-43d7-b9de-8fdb68a44c0b-kube-api-access-bbp76\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.418737 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-apiservice-cert\") pod 
\"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.430881 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-webhook-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.431617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbp76\" (UniqueName: \"kubernetes.io/projected/82716046-7f15-43d7-b9de-8fdb68a44c0b-kube-api-access-bbp76\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.445528 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"] Feb 17 13:41:12 crc kubenswrapper[4804]: W0217 13:41:12.451022 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc17333d4_cfc6_4129_af9e_a8f2db54988b.slice/crio-f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866 WatchSource:0}: Error finding container f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866: Status 404 returned error can't find the container with id f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866 Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.528533 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.722910 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"] Feb 17 13:41:12 crc kubenswrapper[4804]: W0217 13:41:12.729490 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82716046_7f15_43d7_b9de_8fdb68a44c0b.slice/crio-1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97 WatchSource:0}: Error finding container 1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97: Status 404 returned error can't find the container with id 1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97 Feb 17 13:41:13 crc kubenswrapper[4804]: I0217 13:41:13.459223 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" event={"ID":"c17333d4-cfc6-4129-af9e-a8f2db54988b","Type":"ContainerStarted","Data":"f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866"} Feb 17 13:41:13 crc kubenswrapper[4804]: I0217 13:41:13.461075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" event={"ID":"82716046-7f15-43d7-b9de-8fdb68a44c0b","Type":"ContainerStarted","Data":"1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97"} Feb 17 13:41:15 crc kubenswrapper[4804]: I0217 13:41:15.483859 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" event={"ID":"c17333d4-cfc6-4129-af9e-a8f2db54988b","Type":"ContainerStarted","Data":"7bd1ef8d29be94d011cdfeb8205fcb4af1e446114c5e0bc34cad73b7049c8bf8"} Feb 17 13:41:15 crc kubenswrapper[4804]: I0217 13:41:15.484283 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:15 crc kubenswrapper[4804]: I0217 13:41:15.511418 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" podStartSLOduration=1.8495849130000002 podStartE2EDuration="4.5114007s" podCreationTimestamp="2026-02-17 13:41:11 +0000 UTC" firstStartedPulling="2026-02-17 13:41:12.454247242 +0000 UTC m=+946.565666579" lastFinishedPulling="2026-02-17 13:41:15.116063029 +0000 UTC m=+949.227482366" observedRunningTime="2026-02-17 13:41:15.509745948 +0000 UTC m=+949.621165285" watchObservedRunningTime="2026-02-17 13:41:15.5114007 +0000 UTC m=+949.622820037" Feb 17 13:41:17 crc kubenswrapper[4804]: I0217 13:41:17.496011 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" event={"ID":"82716046-7f15-43d7-b9de-8fdb68a44c0b","Type":"ContainerStarted","Data":"c009fabd2863594a3f1f3c18019679459783c10d7e581bcda3a7b8fdd4b96759"} Feb 17 13:41:17 crc kubenswrapper[4804]: I0217 13:41:17.496383 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:32 crc kubenswrapper[4804]: I0217 13:41:32.532363 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:32 crc kubenswrapper[4804]: I0217 13:41:32.547499 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" podStartSLOduration=16.074899336 podStartE2EDuration="20.547484859s" podCreationTimestamp="2026-02-17 13:41:12 +0000 UTC" firstStartedPulling="2026-02-17 13:41:12.733271813 +0000 UTC m=+946.844691150" lastFinishedPulling="2026-02-17 13:41:17.205857336 +0000 UTC m=+951.317276673" observedRunningTime="2026-02-17 
13:41:17.518901384 +0000 UTC m=+951.630320741" watchObservedRunningTime="2026-02-17 13:41:32.547484859 +0000 UTC m=+966.658904196" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.161289 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.857738 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5ls9t"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.861110 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.863361 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.863593 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6z9h5" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.863785 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.865759 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.866681 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.868419 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869275 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869338 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-sockets\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869364 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hllj\" (UniqueName: \"kubernetes.io/projected/0d003d1c-2370-4291-a035-0ebe8b97cfee-kube-api-access-2hllj\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869396 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njbjj\" (UniqueName: \"kubernetes.io/projected/2cf110f6-e70a-45af-a634-744262733250-kube-api-access-njbjj\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869440 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-metrics\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869473 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cf110f6-e70a-45af-a634-744262733250-frr-startup\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869499 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cf110f6-e70a-45af-a634-744262733250-metrics-certs\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869526 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-conf\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869550 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-reloader\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.876014 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 
13:41:52.943931 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wrsrf"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.944975 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.954819 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.954833 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.954978 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5hn8c" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.955380 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970095 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-sockets\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970125 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hllj\" (UniqueName: \"kubernetes.io/projected/0d003d1c-2370-4291-a035-0ebe8b97cfee-kube-api-access-2hllj\") pod 
\"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970157 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhngs\" (UniqueName: \"kubernetes.io/projected/ef60181c-19a6-454c-a197-2b0af0ac2edf-kube-api-access-rhngs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970181 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njbjj\" (UniqueName: \"kubernetes.io/projected/2cf110f6-e70a-45af-a634-744262733250-kube-api-access-njbjj\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: E0217 13:41:52.970238 4804 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 17 13:41:52 crc kubenswrapper[4804]: E0217 13:41:52.970317 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert podName:0d003d1c-2370-4291-a035-0ebe8b97cfee nodeName:}" failed. No retries permitted until 2026-02-17 13:41:53.470297801 +0000 UTC m=+987.581717138 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert") pod "frr-k8s-webhook-server-78b44bf5bb-gl8tp" (UID: "0d003d1c-2370-4291-a035-0ebe8b97cfee") : secret "frr-k8s-webhook-server-cert" not found Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970239 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-metrics\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970522 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ef60181c-19a6-454c-a197-2b0af0ac2edf-metallb-excludel2\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-sockets\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-metrics-certs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970660 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-metrics\") pod \"frr-k8s-5ls9t\" (UID: 
\"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cf110f6-e70a-45af-a634-744262733250-frr-startup\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970924 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cf110f6-e70a-45af-a634-744262733250-metrics-certs\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971008 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-conf\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971088 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-reloader\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971241 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971498 4804 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cf110f6-e70a-45af-a634-744262733250-frr-startup\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-conf\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971974 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-reloader\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.981497 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-wg4pd"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.982504 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.988367 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.988861 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cf110f6-e70a-45af-a634-744262733250-metrics-certs\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.991824 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njbjj\" (UniqueName: \"kubernetes.io/projected/2cf110f6-e70a-45af-a634-744262733250-kube-api-access-njbjj\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.992842 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hllj\" (UniqueName: \"kubernetes.io/projected/0d003d1c-2370-4291-a035-0ebe8b97cfee-kube-api-access-2hllj\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.004695 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-wg4pd"] Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072099 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ef60181c-19a6-454c-a197-2b0af0ac2edf-metallb-excludel2\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072141 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-metrics-certs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072209 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-cert\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072259 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjxxq\" (UniqueName: \"kubernetes.io/projected/01625c42-e1b1-470d-b705-47b30fec457a-kube-api-access-jjxxq\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072283 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-metrics-certs\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072300 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rhngs\" (UniqueName: \"kubernetes.io/projected/ef60181c-19a6-454c-a197-2b0af0ac2edf-kube-api-access-rhngs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.073138 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ef60181c-19a6-454c-a197-2b0af0ac2edf-metallb-excludel2\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.073599 4804 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.073646 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist podName:ef60181c-19a6-454c-a197-2b0af0ac2edf nodeName:}" failed. No retries permitted until 2026-02-17 13:41:53.573631606 +0000 UTC m=+987.685050943 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist") pod "speaker-wrsrf" (UID: "ef60181c-19a6-454c-a197-2b0af0ac2edf") : secret "metallb-memberlist" not found
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.076141 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-metrics-certs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.095775 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhngs\" (UniqueName: \"kubernetes.io/projected/ef60181c-19a6-454c-a197-2b0af0ac2edf-kube-api-access-rhngs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.172672 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-cert\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.172777 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjxxq\" (UniqueName: \"kubernetes.io/projected/01625c42-e1b1-470d-b705-47b30fec457a-kube-api-access-jjxxq\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.172803 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-metrics-certs\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.174228 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.176948 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-metrics-certs\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.181303 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5ls9t"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.186577 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-cert\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.187418 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjxxq\" (UniqueName: \"kubernetes.io/projected/01625c42-e1b1-470d-b705-47b30fec457a-kube-api-access-jjxxq\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.343673 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.478080 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.487475 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.492493 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.579844 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf"
Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.580060 4804 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.580110 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist podName:ef60181c-19a6-454c-a197-2b0af0ac2edf nodeName:}" failed. No retries permitted until 2026-02-17 13:41:54.580095971 +0000 UTC m=+988.691515308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist") pod "speaker-wrsrf" (UID: "ef60181c-19a6-454c-a197-2b0af0ac2edf") : secret "metallb-memberlist" not found
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.722916 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"3c45f79d8d33e0ffd9427a2eaba1620cde9b11dcc3b49d408e4a8dbea30ad617"}
Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.988383 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-wg4pd"]
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.363887 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"]
Feb 17 13:41:54 crc kubenswrapper[4804]: W0217 13:41:54.370862 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d003d1c_2370_4291_a035_0ebe8b97cfee.slice/crio-f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e WatchSource:0}: Error finding container f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e: Status 404 returned error can't find the container with id f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.594794 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf"
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.602971 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf"
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.728527 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" event={"ID":"0d003d1c-2370-4291-a035-0ebe8b97cfee","Type":"ContainerStarted","Data":"f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e"}
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.730635 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-wg4pd" event={"ID":"01625c42-e1b1-470d-b705-47b30fec457a","Type":"ContainerStarted","Data":"c09b74cc048110353c52f9aae1096e4c8674260ad5f2dcd3da1eda135e5eef04"}
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.731350 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.731437 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-wg4pd" event={"ID":"01625c42-e1b1-470d-b705-47b30fec457a","Type":"ContainerStarted","Data":"845e848caa2f8b769d7df1a66ad9e3a5c70a09490ed50bcad12a8bb787f88215"}
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.731506 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-wg4pd" event={"ID":"01625c42-e1b1-470d-b705-47b30fec457a","Type":"ContainerStarted","Data":"ad2b7d78a8e8dee83f207f1660f4f8d5ffa09cb48dad1685ebde4a4db2e3d411"}
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.756482 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-wg4pd" podStartSLOduration=2.7564559600000003 podStartE2EDuration="2.75645596s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:41:54.746796419 +0000 UTC m=+988.858215766" watchObservedRunningTime="2026-02-17 13:41:54.75645596 +0000 UTC m=+988.867875297"
Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.757759 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wrsrf"
Feb 17 13:41:54 crc kubenswrapper[4804]: W0217 13:41:54.782489 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef60181c_19a6_454c_a197_2b0af0ac2edf.slice/crio-4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520 WatchSource:0}: Error finding container 4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520: Status 404 returned error can't find the container with id 4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520
Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739162 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wrsrf" event={"ID":"ef60181c-19a6-454c-a197-2b0af0ac2edf","Type":"ContainerStarted","Data":"e250d36173e404a0945e19572b7e543c77be4f07a974ac9da4a8b694951defb3"}
Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739503 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wrsrf" event={"ID":"ef60181c-19a6-454c-a197-2b0af0ac2edf","Type":"ContainerStarted","Data":"42c6823e8bc58c4e3d1883cbddac301df8020c586d1c3684c9adee09bcd76554"}
Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739517 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wrsrf" event={"ID":"ef60181c-19a6-454c-a197-2b0af0ac2edf","Type":"ContainerStarted","Data":"4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520"}
Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739904 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wrsrf"
Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.765224 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wrsrf" podStartSLOduration=3.7651908880000002 podStartE2EDuration="3.765190888s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:41:55.762985039 +0000 UTC m=+989.874404386" watchObservedRunningTime="2026-02-17 13:41:55.765190888 +0000 UTC m=+989.876610225"
Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.835911 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.835986 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.840472 4804 generic.go:334] "Generic (PLEG): container finished" podID="2cf110f6-e70a-45af-a634-744262733250" containerID="31a91661ad613675d481d75b9eb1010b10af1c6ecc85489ed53a7783cb7723a2" exitCode=0
Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.840946 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerDied","Data":"31a91661ad613675d481d75b9eb1010b10af1c6ecc85489ed53a7783cb7723a2"}
Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.843164 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" event={"ID":"0d003d1c-2370-4291-a035-0ebe8b97cfee","Type":"ContainerStarted","Data":"9cbd5bc77080f3176cb4b17720ecda8736208c9c38a000cb824af2f7c6983de3"}
Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.843331 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"
Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.886185 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" podStartSLOduration=3.550912601 podStartE2EDuration="11.886163077s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="2026-02-17 13:41:54.374714387 +0000 UTC m=+988.486133734" lastFinishedPulling="2026-02-17 13:42:02.709964873 +0000 UTC m=+996.821384210" observedRunningTime="2026-02-17 13:42:03.881012247 +0000 UTC m=+997.992431584" watchObservedRunningTime="2026-02-17 13:42:03.886163077 +0000 UTC m=+997.997582424"
Feb 17 13:42:04 crc kubenswrapper[4804]: I0217 13:42:04.853621 4804 generic.go:334] "Generic (PLEG): container finished" podID="2cf110f6-e70a-45af-a634-744262733250" containerID="286a37042ccecc7feae021e7d3d35a3a25c5461d1f7a95649e636d69ec398d1c" exitCode=0
Feb 17 13:42:04 crc kubenswrapper[4804]: I0217 13:42:04.853852 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerDied","Data":"286a37042ccecc7feae021e7d3d35a3a25c5461d1f7a95649e636d69ec398d1c"}
Feb 17 13:42:05 crc kubenswrapper[4804]: I0217 13:42:05.866280 4804 generic.go:334] "Generic (PLEG): container finished" podID="2cf110f6-e70a-45af-a634-744262733250" containerID="d183889706e6c6c384274a90a2714435f1701f94a432b836c8f9c14f439d512b" exitCode=0
Feb 17 13:42:05 crc kubenswrapper[4804]: I0217 13:42:05.866345 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerDied","Data":"d183889706e6c6c384274a90a2714435f1701f94a432b836c8f9c14f439d512b"}
Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"8a5c15369198d50df5e850b53f78f17fbe8c70b3c65ec19fddcb9ee2117886ac"}
Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878846 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"a029044cfab50e217f50db0721984bbf20ff684705b44ebae942d30c10b54c68"}
Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878860 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"f9df17c01d36427bb662abaff76e285be5e09e55319387c314d10a038cca9e47"}
Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878872 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"774adec7ad193a8c0096330652f3c3ed1acee59c66563bc91df03ca73d822d7c"}
Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878884 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"a0535b85fbdd1d49661d3783776df228ddffff505a184606493b94f135df0702"}
Feb 17 13:42:07 crc kubenswrapper[4804]: I0217 13:42:07.892700 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"c8180c1f95b5adc892185b1b075d9d3853e0d01e17952fb55475654faebc2634"}
Feb 17 13:42:07 crc kubenswrapper[4804]: I0217 13:42:07.893074 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5ls9t"
Feb 17 13:42:07 crc kubenswrapper[4804]: I0217 13:42:07.928070 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5ls9t" podStartSLOduration=6.517898868 podStartE2EDuration="15.928046557s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="2026-02-17 13:41:53.281784062 +0000 UTC m=+987.393203439" lastFinishedPulling="2026-02-17 13:42:02.691931791 +0000 UTC m=+996.803351128" observedRunningTime="2026-02-17 13:42:07.92171198 +0000 UTC m=+1002.033131397" watchObservedRunningTime="2026-02-17 13:42:07.928046557 +0000 UTC m=+1002.039465934"
Feb 17 13:42:08 crc kubenswrapper[4804]: I0217 13:42:08.181907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5ls9t"
Feb 17 13:42:08 crc kubenswrapper[4804]: I0217 13:42:08.220702 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5ls9t"
Feb 17 13:42:13 crc kubenswrapper[4804]: I0217 13:42:13.351952 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-wg4pd"
Feb 17 13:42:13 crc kubenswrapper[4804]: I0217 13:42:13.497781 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"
Feb 17 13:42:14 crc kubenswrapper[4804]: I0217 13:42:14.763326 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wrsrf"
Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.925465 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-55nc6"]
Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.926958 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.928596 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-79v2d"
Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.929278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.933169 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.940689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-55nc6"]
Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.014810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/13d9e436-3cb0-4df0-aaf9-e614eba74c89-kube-api-access-cmj92\") pod \"openstack-operator-index-55nc6\" (UID: \"13d9e436-3cb0-4df0-aaf9-e614eba74c89\") " pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.116075 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/13d9e436-3cb0-4df0-aaf9-e614eba74c89-kube-api-access-cmj92\") pod \"openstack-operator-index-55nc6\" (UID: \"13d9e436-3cb0-4df0-aaf9-e614eba74c89\") " pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.142476 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/13d9e436-3cb0-4df0-aaf9-e614eba74c89-kube-api-access-cmj92\") pod \"openstack-operator-index-55nc6\" (UID: \"13d9e436-3cb0-4df0-aaf9-e614eba74c89\") " pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.245473 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.697923 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-55nc6"]
Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.997809 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-55nc6" event={"ID":"13d9e436-3cb0-4df0-aaf9-e614eba74c89","Type":"ContainerStarted","Data":"861d1c69d6a8221ff3032e9b5c4ea80bff43cdfd3c764102c5d643c7cc5ce89c"}
Feb 17 13:42:23 crc kubenswrapper[4804]: I0217 13:42:23.184603 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5ls9t"
Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.016142 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-55nc6" event={"ID":"13d9e436-3cb0-4df0-aaf9-e614eba74c89","Type":"ContainerStarted","Data":"a5f9f93ea4da96eee98bfd46ea36bb4e837a791f14d21668403ddf6cb911e961"}
Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.035436 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-55nc6" podStartSLOduration=2.307146188 podStartE2EDuration="5.035417136s" podCreationTimestamp="2026-02-17 13:42:20 +0000 UTC" firstStartedPulling="2026-02-17 13:42:21.709749956 +0000 UTC m=+1015.821169293" lastFinishedPulling="2026-02-17 13:42:24.438020914 +0000 UTC m=+1018.549440241" observedRunningTime="2026-02-17 13:42:25.031453272 +0000 UTC m=+1019.142872609" watchObservedRunningTime="2026-02-17 13:42:25.035417136 +0000 UTC m=+1019.146836493"
Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.835049 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.835122 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:42:31 crc kubenswrapper[4804]: I0217 13:42:31.246156 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:31 crc kubenswrapper[4804]: I0217 13:42:31.247372 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:31 crc kubenswrapper[4804]: I0217 13:42:31.268076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:32 crc kubenswrapper[4804]: I0217 13:42:32.154275 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-55nc6"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.834971 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"]
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.837656 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.841536 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"]
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.841847 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hjhwq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.858061 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.858136 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.858277 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.960080 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.960182 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.960274 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.961053 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.961254 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.983834 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:38 crc kubenswrapper[4804]: I0217 13:42:38.156927 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:38 crc kubenswrapper[4804]: I0217 13:42:38.619740 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"]
Feb 17 13:42:38 crc kubenswrapper[4804]: W0217 13:42:38.628163 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2739bc_c729_4c9f_856b_9a08143fc359.slice/crio-ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105 WatchSource:0}: Error finding container ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105: Status 404 returned error can't find the container with id ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105
Feb 17 13:42:39 crc kubenswrapper[4804]: E0217 13:42:39.000762 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2739bc_c729_4c9f_856b_9a08143fc359.slice/crio-conmon-b15b4e31d6288ad5c02820211ee924900b2037c0db297292061f024011e20eb0.scope\": RecentStats: unable to find data in memory cache]"
Feb 17 13:42:39 crc kubenswrapper[4804]: I0217 13:42:39.164698 4804 generic.go:334] "Generic (PLEG): container finished" podID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerID="b15b4e31d6288ad5c02820211ee924900b2037c0db297292061f024011e20eb0" exitCode=0
Feb 17 13:42:39 crc kubenswrapper[4804]: I0217 13:42:39.164743 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"b15b4e31d6288ad5c02820211ee924900b2037c0db297292061f024011e20eb0"}
Feb 17 13:42:39 crc kubenswrapper[4804]: I0217 13:42:39.164768 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerStarted","Data":"ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105"}
Feb 17 13:42:40 crc kubenswrapper[4804]: I0217 13:42:40.174590 4804 generic.go:334] "Generic (PLEG): container finished" podID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerID="39408a9beaa99c4209df24118356ddcb6bea1315c96157a9d6f36ce235dcb210" exitCode=0
Feb 17 13:42:40 crc kubenswrapper[4804]: I0217 13:42:40.174881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"39408a9beaa99c4209df24118356ddcb6bea1315c96157a9d6f36ce235dcb210"}
Feb 17 13:42:41 crc kubenswrapper[4804]: I0217 13:42:41.184684 4804 generic.go:334] "Generic (PLEG): container finished" podID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerID="aeb77e12a460173d6ba6457217034163673580ddc92d9330909fc77828824bae" exitCode=0
Feb 17 13:42:41 crc kubenswrapper[4804]: I0217 13:42:41.184738 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"aeb77e12a460173d6ba6457217034163673580ddc92d9330909fc77828824bae"}
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.432019 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.622778 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"fc2739bc-c729-4c9f-856b-9a08143fc359\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") "
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.623677 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"fc2739bc-c729-4c9f-856b-9a08143fc359\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") "
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.623776 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"fc2739bc-c729-4c9f-856b-9a08143fc359\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") "
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.623771 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle" (OuterVolumeSpecName: "bundle") pod "fc2739bc-c729-4c9f-856b-9a08143fc359" (UID: "fc2739bc-c729-4c9f-856b-9a08143fc359"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.624452 4804 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.629864 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g" (OuterVolumeSpecName: "kube-api-access-wtz5g") pod "fc2739bc-c729-4c9f-856b-9a08143fc359" (UID: "fc2739bc-c729-4c9f-856b-9a08143fc359"). InnerVolumeSpecName "kube-api-access-wtz5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.638391 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util" (OuterVolumeSpecName: "util") pod "fc2739bc-c729-4c9f-856b-9a08143fc359" (UID: "fc2739bc-c729-4c9f-856b-9a08143fc359"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.727062 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") on node \"crc\" DevicePath \"\""
Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.727125 4804 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") on node \"crc\" DevicePath \"\""
Feb 17 13:42:43 crc kubenswrapper[4804]: I0217 13:42:43.199665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105"}
Feb 17 13:42:43 crc kubenswrapper[4804]: I0217 13:42:43.199701 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105"
Feb 17 13:42:43 crc kubenswrapper[4804]: I0217 13:42:43.199754 4804 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.667979 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x"] Feb 17 13:42:44 crc kubenswrapper[4804]: E0217 13:42:44.668258 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="pull" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668271 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="pull" Feb 17 13:42:44 crc kubenswrapper[4804]: E0217 13:42:44.668279 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="util" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668285 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="util" Feb 17 13:42:44 crc kubenswrapper[4804]: E0217 13:42:44.668305 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="extract" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668311 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="extract" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668410 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="extract" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668784 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.671652 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-vx6dt" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.686461 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x"] Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.855457 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrn7\" (UniqueName: \"kubernetes.io/projected/f69fc148-3a8b-4065-b075-85ecad8339e7-kube-api-access-6mrn7\") pod \"openstack-operator-controller-init-7cb8c4979f-kfx9x\" (UID: \"f69fc148-3a8b-4065-b075-85ecad8339e7\") " pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.956771 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mrn7\" (UniqueName: \"kubernetes.io/projected/f69fc148-3a8b-4065-b075-85ecad8339e7-kube-api-access-6mrn7\") pod \"openstack-operator-controller-init-7cb8c4979f-kfx9x\" (UID: \"f69fc148-3a8b-4065-b075-85ecad8339e7\") " pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.977251 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mrn7\" (UniqueName: \"kubernetes.io/projected/f69fc148-3a8b-4065-b075-85ecad8339e7-kube-api-access-6mrn7\") pod \"openstack-operator-controller-init-7cb8c4979f-kfx9x\" (UID: \"f69fc148-3a8b-4065-b075-85ecad8339e7\") " pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.987985 4804 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:45 crc kubenswrapper[4804]: I0217 13:42:45.211669 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x"] Feb 17 13:42:45 crc kubenswrapper[4804]: W0217 13:42:45.217282 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf69fc148_3a8b_4065_b075_85ecad8339e7.slice/crio-fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b WatchSource:0}: Error finding container fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b: Status 404 returned error can't find the container with id fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b Feb 17 13:42:46 crc kubenswrapper[4804]: I0217 13:42:46.218185 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" event={"ID":"f69fc148-3a8b-4065-b075-85ecad8339e7","Type":"ContainerStarted","Data":"fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b"} Feb 17 13:42:50 crc kubenswrapper[4804]: I0217 13:42:50.260311 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" event={"ID":"f69fc148-3a8b-4065-b075-85ecad8339e7","Type":"ContainerStarted","Data":"078b6f1e61c73fda49819c4d9e4a1fb2d364c25e76601611efec9ae7181342b0"} Feb 17 13:42:50 crc kubenswrapper[4804]: I0217 13:42:50.260968 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:50 crc kubenswrapper[4804]: I0217 13:42:50.296275 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" 
podStartSLOduration=2.0937269020000002 podStartE2EDuration="6.296260945s" podCreationTimestamp="2026-02-17 13:42:44 +0000 UTC" firstStartedPulling="2026-02-17 13:42:45.22104713 +0000 UTC m=+1039.332466467" lastFinishedPulling="2026-02-17 13:42:49.423581173 +0000 UTC m=+1043.535000510" observedRunningTime="2026-02-17 13:42:50.296094331 +0000 UTC m=+1044.407513678" watchObservedRunningTime="2026-02-17 13:42:50.296260945 +0000 UTC m=+1044.407680282" Feb 17 13:42:54 crc kubenswrapper[4804]: I0217 13:42:54.990695 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.835500 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.835816 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.835863 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.836426 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.836485 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69" gracePeriod=600 Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.305741 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69" exitCode=0 Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.305785 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69"} Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.306001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7"} Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.306034 4804 scope.go:117] "RemoveContainer" containerID="8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.468795 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.470639 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.473652 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zvk85" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.479420 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.480490 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.485506 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4dhxq" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.490394 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.497972 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.545331 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.547075 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.549189 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxch4\" (UniqueName: \"kubernetes.io/projected/545c7d25-7774-4c62-89b8-f491fd4065e8-kube-api-access-xxch4\") pod \"barbican-operator-controller-manager-c4b7d6946-4xvfg\" (UID: \"545c7d25-7774-4c62-89b8-f491fd4065e8\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.549355 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6npz\" (UniqueName: \"kubernetes.io/projected/0b746a42-c0b4-4cb9-9352-3623669bad5a-kube-api-access-t6npz\") pod \"cinder-operator-controller-manager-57746b5ff9-wn64m\" (UID: \"0b746a42-c0b4-4cb9-9352-3623669bad5a\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.553631 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw82v\" (UniqueName: \"kubernetes.io/projected/fbc5e6cd-47c6-4199-a0f2-e4292a836fac-kube-api-access-qw82v\") pod \"designate-operator-controller-manager-55cc45767f-bslfv\" (UID: \"fbc5e6cd-47c6-4199-a0f2-e4292a836fac\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.552756 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-m6ftc" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.565726 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"] Feb 17 13:43:15 crc kubenswrapper[4804]: 
I0217 13:43:15.578174 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.579097 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.585467 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-96794" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.588893 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.596403 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.599320 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.602410 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q7f6c" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.617696 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.618645 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.624340 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wxf2c" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.634152 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.635168 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.640117 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.643513 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.648815 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5zgr8" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.653886 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656407 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw82v\" (UniqueName: \"kubernetes.io/projected/fbc5e6cd-47c6-4199-a0f2-e4292a836fac-kube-api-access-qw82v\") pod \"designate-operator-controller-manager-55cc45767f-bslfv\" (UID: \"fbc5e6cd-47c6-4199-a0f2-e4292a836fac\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:15 crc 
kubenswrapper[4804]: I0217 13:43:15.656500 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pjq\" (UniqueName: \"kubernetes.io/projected/5fa66dc5-a518-40dd-a4b5-dd2b34425ad5-kube-api-access-74pjq\") pod \"horizon-operator-controller-manager-54fb488b88-t6hlr\" (UID: \"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656589 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656657 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8dcg\" (UniqueName: \"kubernetes.io/projected/bf13099a-fbab-41bf-b30c-5c6b1049af19-kube-api-access-b8dcg\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pmcl\" (UniqueName: \"kubernetes.io/projected/5796dc62-bd84-48b7-9c4c-7d5bf1f7e984-kube-api-access-6pmcl\") pod \"glance-operator-controller-manager-68c6d499cb-vt6zw\" (UID: \"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656860 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xxch4\" (UniqueName: \"kubernetes.io/projected/545c7d25-7774-4c62-89b8-f491fd4065e8-kube-api-access-xxch4\") pod \"barbican-operator-controller-manager-c4b7d6946-4xvfg\" (UID: \"545c7d25-7774-4c62-89b8-f491fd4065e8\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656896 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs55s\" (UniqueName: \"kubernetes.io/projected/5727ae12-4720-4470-b5cc-8b8ae81c2af7-kube-api-access-qs55s\") pod \"heat-operator-controller-manager-9595d6797-sxtr2\" (UID: \"5727ae12-4720-4470-b5cc-8b8ae81c2af7\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6npz\" (UniqueName: \"kubernetes.io/projected/0b746a42-c0b4-4cb9-9352-3623669bad5a-kube-api-access-t6npz\") pod \"cinder-operator-controller-manager-57746b5ff9-wn64m\" (UID: \"0b746a42-c0b4-4cb9-9352-3623669bad5a\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.674847 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.675617 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.679594 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.690297 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.691496 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.704408 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.709215 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-pgk8x" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.710827 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f5pbb" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.714261 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.715394 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw82v\" (UniqueName: \"kubernetes.io/projected/fbc5e6cd-47c6-4199-a0f2-e4292a836fac-kube-api-access-qw82v\") pod \"designate-operator-controller-manager-55cc45767f-bslfv\" (UID: \"fbc5e6cd-47c6-4199-a0f2-e4292a836fac\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:15 crc 
kubenswrapper[4804]: I0217 13:43:15.723775 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.724519 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.732871 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7tl8t" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.737573 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6npz\" (UniqueName: \"kubernetes.io/projected/0b746a42-c0b4-4cb9-9352-3623669bad5a-kube-api-access-t6npz\") pod \"cinder-operator-controller-manager-57746b5ff9-wn64m\" (UID: \"0b746a42-c0b4-4cb9-9352-3623669bad5a\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.744573 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.745948 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxch4\" (UniqueName: \"kubernetes.io/projected/545c7d25-7774-4c62-89b8-f491fd4065e8-kube-api-access-xxch4\") pod \"barbican-operator-controller-manager-c4b7d6946-4xvfg\" (UID: \"545c7d25-7774-4c62-89b8-f491fd4065e8\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.750095 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.751092 4804 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.779759 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wg7gs" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780350 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780503 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xc8\" (UniqueName: \"kubernetes.io/projected/430279ab-ba2f-4838-ab07-b851d4df84a0-kube-api-access-v8xc8\") pod \"keystone-operator-controller-manager-6c78d668d5-pddsh\" (UID: \"430279ab-ba2f-4838-ab07-b851d4df84a0\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780557 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs55s\" (UniqueName: \"kubernetes.io/projected/5727ae12-4720-4470-b5cc-8b8ae81c2af7-kube-api-access-qs55s\") pod \"heat-operator-controller-manager-9595d6797-sxtr2\" (UID: \"5727ae12-4720-4470-b5cc-8b8ae81c2af7\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780588 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cst4d\" (UniqueName: \"kubernetes.io/projected/07b97973-fa08-4b79-9164-918a4d04f8b7-kube-api-access-cst4d\") pod \"ironic-operator-controller-manager-6494cdbf8f-cdpkr\" (UID: \"07b97973-fa08-4b79-9164-918a4d04f8b7\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780609 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74pjq\" (UniqueName: \"kubernetes.io/projected/5fa66dc5-a518-40dd-a4b5-dd2b34425ad5-kube-api-access-74pjq\") pod \"horizon-operator-controller-manager-54fb488b88-t6hlr\" (UID: \"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780673 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780694 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw7hl\" (UniqueName: \"kubernetes.io/projected/d3332002-6930-418f-8288-e8344be70c6a-kube-api-access-mw7hl\") pod \"manila-operator-controller-manager-96fff9cb8-88sh4\" (UID: \"d3332002-6930-418f-8288-e8344be70c6a\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780719 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8dcg\" (UniqueName: \"kubernetes.io/projected/bf13099a-fbab-41bf-b30c-5c6b1049af19-kube-api-access-b8dcg\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780749 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pmcl\" (UniqueName: 
\"kubernetes.io/projected/5796dc62-bd84-48b7-9c4c-7d5bf1f7e984-kube-api-access-6pmcl\") pod \"glance-operator-controller-manager-68c6d499cb-vt6zw\" (UID: \"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780767 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxvz2\" (UniqueName: \"kubernetes.io/projected/2546387a-6a42-4f8d-a321-2f9cbaa11adb-kube-api-access-bxvz2\") pod \"mariadb-operator-controller-manager-66997756f6-vkdg2\" (UID: \"2546387a-6a42-4f8d-a321-2f9cbaa11adb\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:15 crc kubenswrapper[4804]: E0217 13:43:15.781326 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:15 crc kubenswrapper[4804]: E0217 13:43:15.781373 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:16.281356085 +0000 UTC m=+1070.392775422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.781732 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.786233 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4slfz" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.793478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.798268 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.799369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.806316 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.806927 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.826641 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-fgl99" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.840101 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pmcl\" (UniqueName: \"kubernetes.io/projected/5796dc62-bd84-48b7-9c4c-7d5bf1f7e984-kube-api-access-6pmcl\") pod \"glance-operator-controller-manager-68c6d499cb-vt6zw\" (UID: \"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.840190 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.845290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pjq\" (UniqueName: \"kubernetes.io/projected/5fa66dc5-a518-40dd-a4b5-dd2b34425ad5-kube-api-access-74pjq\") pod \"horizon-operator-controller-manager-54fb488b88-t6hlr\" (UID: \"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.846004 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs55s\" (UniqueName: \"kubernetes.io/projected/5727ae12-4720-4470-b5cc-8b8ae81c2af7-kube-api-access-qs55s\") pod \"heat-operator-controller-manager-9595d6797-sxtr2\" (UID: \"5727ae12-4720-4470-b5cc-8b8ae81c2af7\") " 
pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.881502 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8dcg\" (UniqueName: \"kubernetes.io/projected/bf13099a-fbab-41bf-b30c-5c6b1049af19-kube-api-access-b8dcg\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882071 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxvz2\" (UniqueName: \"kubernetes.io/projected/2546387a-6a42-4f8d-a321-2f9cbaa11adb-kube-api-access-bxvz2\") pod \"mariadb-operator-controller-manager-66997756f6-vkdg2\" (UID: \"2546387a-6a42-4f8d-a321-2f9cbaa11adb\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882140 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xc8\" (UniqueName: \"kubernetes.io/projected/430279ab-ba2f-4838-ab07-b851d4df84a0-kube-api-access-v8xc8\") pod \"keystone-operator-controller-manager-6c78d668d5-pddsh\" (UID: \"430279ab-ba2f-4838-ab07-b851d4df84a0\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882184 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cst4d\" (UniqueName: \"kubernetes.io/projected/07b97973-fa08-4b79-9164-918a4d04f8b7-kube-api-access-cst4d\") pod \"ironic-operator-controller-manager-6494cdbf8f-cdpkr\" (UID: \"07b97973-fa08-4b79-9164-918a4d04f8b7\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882266 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw7hl\" (UniqueName: \"kubernetes.io/projected/d3332002-6930-418f-8288-e8344be70c6a-kube-api-access-mw7hl\") pod \"manila-operator-controller-manager-96fff9cb8-88sh4\" (UID: \"d3332002-6930-418f-8288-e8344be70c6a\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.884970 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.900945 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.904121 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.912255 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.913044 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.916686 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.918934 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-xcktr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.937225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw7hl\" (UniqueName: \"kubernetes.io/projected/d3332002-6930-418f-8288-e8344be70c6a-kube-api-access-mw7hl\") pod \"manila-operator-controller-manager-96fff9cb8-88sh4\" (UID: \"d3332002-6930-418f-8288-e8344be70c6a\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.946172 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xc8\" (UniqueName: \"kubernetes.io/projected/430279ab-ba2f-4838-ab07-b851d4df84a0-kube-api-access-v8xc8\") pod \"keystone-operator-controller-manager-6c78d668d5-pddsh\" (UID: \"430279ab-ba2f-4838-ab07-b851d4df84a0\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.948006 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxvz2\" (UniqueName: \"kubernetes.io/projected/2546387a-6a42-4f8d-a321-2f9cbaa11adb-kube-api-access-bxvz2\") pod \"mariadb-operator-controller-manager-66997756f6-vkdg2\" (UID: \"2546387a-6a42-4f8d-a321-2f9cbaa11adb\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.960471 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.962640 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.971618 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.984398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls4zb\" (UniqueName: \"kubernetes.io/projected/79eb8fb0-6207-44c8-b3c2-a00116bcf10b-kube-api-access-ls4zb\") pod \"octavia-operator-controller-manager-745bbbd77b-ptrs5\" (UID: \"79eb8fb0-6207-44c8-b3c2-a00116bcf10b\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.984512 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/36b1ca46-becb-417e-b05e-777d40246cb6-kube-api-access-stpzs\") pod \"nova-operator-controller-manager-5ddd85db87-c8hmm\" (UID: \"36b1ca46-becb-417e-b05e-777d40246cb6\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.984612 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pcfl\" (UniqueName: \"kubernetes.io/projected/97925efc-eb46-4a60-b372-b31f13a2c876-kube-api-access-5pcfl\") pod \"neutron-operator-controller-manager-54967dbbdf-l5cl2\" (UID: \"97925efc-eb46-4a60-b372-b31f13a2c876\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:15 crc kubenswrapper[4804]: 
I0217 13:43:15.988466 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:15.999845 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.008110 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cst4d\" (UniqueName: \"kubernetes.io/projected/07b97973-fa08-4b79-9164-918a4d04f8b7-kube-api-access-cst4d\") pod \"ironic-operator-controller-manager-6494cdbf8f-cdpkr\" (UID: \"07b97973-fa08-4b79-9164-918a4d04f8b7\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.008203 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pxc28" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.008489 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.044289 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.100925 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/36b1ca46-becb-417e-b05e-777d40246cb6-kube-api-access-stpzs\") pod \"nova-operator-controller-manager-5ddd85db87-c8hmm\" (UID: \"36b1ca46-becb-417e-b05e-777d40246cb6\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.101188 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pcfl\" (UniqueName: \"kubernetes.io/projected/97925efc-eb46-4a60-b372-b31f13a2c876-kube-api-access-5pcfl\") pod \"neutron-operator-controller-manager-54967dbbdf-l5cl2\" (UID: \"97925efc-eb46-4a60-b372-b31f13a2c876\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.112627 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls4zb\" (UniqueName: \"kubernetes.io/projected/79eb8fb0-6207-44c8-b3c2-a00116bcf10b-kube-api-access-ls4zb\") pod \"octavia-operator-controller-manager-745bbbd77b-ptrs5\" (UID: \"79eb8fb0-6207-44c8-b3c2-a00116bcf10b\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.119473 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.121860 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.139250 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.142694 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-dxk4c" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.160009 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.166336 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls4zb\" (UniqueName: \"kubernetes.io/projected/79eb8fb0-6207-44c8-b3c2-a00116bcf10b-kube-api-access-ls4zb\") pod \"octavia-operator-controller-manager-745bbbd77b-ptrs5\" (UID: \"79eb8fb0-6207-44c8-b3c2-a00116bcf10b\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.168929 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pcfl\" (UniqueName: \"kubernetes.io/projected/97925efc-eb46-4a60-b372-b31f13a2c876-kube-api-access-5pcfl\") pod \"neutron-operator-controller-manager-54967dbbdf-l5cl2\" (UID: \"97925efc-eb46-4a60-b372-b31f13a2c876\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.196998 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/36b1ca46-becb-417e-b05e-777d40246cb6-kube-api-access-stpzs\") pod \"nova-operator-controller-manager-5ddd85db87-c8hmm\" (UID: \"36b1ca46-becb-417e-b05e-777d40246cb6\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.217895 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-kube-api-access-fl5dp\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.218037 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.218163 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.219174 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.222592 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.238509 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.247833 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-n2l94" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.279598 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.281475 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.282127 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.283805 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.286346 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.286656 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kwvbt" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.287272 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.288675 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x2tmn" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.297355 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.298162 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.298267 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.298387 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.299992 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xjgzd" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.309751 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.312001 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.314982 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.315654 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.316894 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.317096 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-258zq" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319023 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxsbs\" (UniqueName: \"kubernetes.io/projected/ac1e20c8-4527-4bba-85bd-2154e1244d3e-kube-api-access-jxsbs\") pod \"ovn-operator-controller-manager-85c99d655-ltwrc\" (UID: \"ac1e20c8-4527-4bba-85bd-2154e1244d3e\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319397 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-kube-api-access-fl5dp\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: 
\"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319450 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319485 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xglv\" (UniqueName: \"kubernetes.io/projected/42505b9c-f878-4feb-b9a1-9dfa11ec0f56-kube-api-access-7xglv\") pod \"placement-operator-controller-manager-57bd55f9b7-9vbg5\" (UID: \"42505b9c-f878-4feb-b9a1-9dfa11ec0f56\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.319642 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.319685 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.319671294 +0000 UTC m=+1071.431090621 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.322285 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.322352 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:16.822341847 +0000 UTC m=+1070.933761174 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.338313 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.339705 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.342713 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.343065 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.343498 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vp69j" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.346778 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.347349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.420566 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxsbs\" (UniqueName: \"kubernetes.io/projected/ac1e20c8-4527-4bba-85bd-2154e1244d3e-kube-api-access-jxsbs\") pod \"ovn-operator-controller-manager-85c99d655-ltwrc\" (UID: \"ac1e20c8-4527-4bba-85bd-2154e1244d3e\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.420818 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkwhq\" (UniqueName: \"kubernetes.io/projected/f94e791f-16fd-4364-a246-35bcca0d14e6-kube-api-access-rkwhq\") pod \"swift-operator-controller-manager-79558bbfbf-n6fl9\" (UID: \"f94e791f-16fd-4364-a246-35bcca0d14e6\") " 
pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421069 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrs6\" (UniqueName: \"kubernetes.io/projected/57038414-fcca-4a2a-8756-46f97cc57d81-kube-api-access-xbrs6\") pod \"watcher-operator-controller-manager-6c469bc6bb-xlwmb\" (UID: \"57038414-fcca-4a2a-8756-46f97cc57d81\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421184 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mgs\" (UniqueName: \"kubernetes.io/projected/1c7ad838-6225-4001-899a-7f741cb75f2f-kube-api-access-x2mgs\") pod \"test-operator-controller-manager-8467ccb4c8-nwmk5\" (UID: \"1c7ad838-6225-4001-899a-7f741cb75f2f\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421292 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rcn4\" (UniqueName: \"kubernetes.io/projected/8155784a-3945-4ca3-aa9a-b0e089ffac52-kube-api-access-8rcn4\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421366 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xglv\" (UniqueName: \"kubernetes.io/projected/42505b9c-f878-4feb-b9a1-9dfa11ec0f56-kube-api-access-7xglv\") pod \"placement-operator-controller-manager-57bd55f9b7-9vbg5\" (UID: \"42505b9c-f878-4feb-b9a1-9dfa11ec0f56\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc 
kubenswrapper[4804]: I0217 13:43:16.421928 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7bxk\" (UniqueName: \"kubernetes.io/projected/067b67c8-64c5-4c21-b1b1-770aa68e0eb7-kube-api-access-q7bxk\") pod \"telemetry-operator-controller-manager-56dc67d744-rbrxl\" (UID: \"067b67c8-64c5-4c21-b1b1-770aa68e0eb7\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.422012 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.422116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.538848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-kube-api-access-fl5dp\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539488 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rkwhq\" (UniqueName: \"kubernetes.io/projected/f94e791f-16fd-4364-a246-35bcca0d14e6-kube-api-access-rkwhq\") pod \"swift-operator-controller-manager-79558bbfbf-n6fl9\" (UID: \"f94e791f-16fd-4364-a246-35bcca0d14e6\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539539 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrs6\" (UniqueName: \"kubernetes.io/projected/57038414-fcca-4a2a-8756-46f97cc57d81-kube-api-access-xbrs6\") pod \"watcher-operator-controller-manager-6c469bc6bb-xlwmb\" (UID: \"57038414-fcca-4a2a-8756-46f97cc57d81\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539582 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mgs\" (UniqueName: \"kubernetes.io/projected/1c7ad838-6225-4001-899a-7f741cb75f2f-kube-api-access-x2mgs\") pod \"test-operator-controller-manager-8467ccb4c8-nwmk5\" (UID: \"1c7ad838-6225-4001-899a-7f741cb75f2f\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539649 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rcn4\" (UniqueName: \"kubernetes.io/projected/8155784a-3945-4ca3-aa9a-b0e089ffac52-kube-api-access-8rcn4\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7bxk\" (UniqueName: \"kubernetes.io/projected/067b67c8-64c5-4c21-b1b1-770aa68e0eb7-kube-api-access-q7bxk\") pod 
\"telemetry-operator-controller-manager-56dc67d744-rbrxl\" (UID: \"067b67c8-64c5-4c21-b1b1-770aa68e0eb7\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539731 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539779 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.539935 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.539994 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.039967808 +0000 UTC m=+1071.151387145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.541759 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxsbs\" (UniqueName: \"kubernetes.io/projected/ac1e20c8-4527-4bba-85bd-2154e1244d3e-kube-api-access-jxsbs\") pod \"ovn-operator-controller-manager-85c99d655-ltwrc\" (UID: \"ac1e20c8-4527-4bba-85bd-2154e1244d3e\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.543890 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.543939 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.043923322 +0000 UTC m=+1071.155342659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.557014 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.564276 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xglv\" (UniqueName: \"kubernetes.io/projected/42505b9c-f878-4feb-b9a1-9dfa11ec0f56-kube-api-access-7xglv\") pod \"placement-operator-controller-manager-57bd55f9b7-9vbg5\" (UID: \"42505b9c-f878-4feb-b9a1-9dfa11ec0f56\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.573951 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7bxk\" (UniqueName: \"kubernetes.io/projected/067b67c8-64c5-4c21-b1b1-770aa68e0eb7-kube-api-access-q7bxk\") pod \"telemetry-operator-controller-manager-56dc67d744-rbrxl\" (UID: \"067b67c8-64c5-4c21-b1b1-770aa68e0eb7\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.589279 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkwhq\" (UniqueName: \"kubernetes.io/projected/f94e791f-16fd-4364-a246-35bcca0d14e6-kube-api-access-rkwhq\") pod \"swift-operator-controller-manager-79558bbfbf-n6fl9\" (UID: \"f94e791f-16fd-4364-a246-35bcca0d14e6\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.677101 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.713751 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mgs\" (UniqueName: \"kubernetes.io/projected/1c7ad838-6225-4001-899a-7f741cb75f2f-kube-api-access-x2mgs\") pod \"test-operator-controller-manager-8467ccb4c8-nwmk5\" (UID: \"1c7ad838-6225-4001-899a-7f741cb75f2f\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.724471 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rcn4\" (UniqueName: \"kubernetes.io/projected/8155784a-3945-4ca3-aa9a-b0e089ffac52-kube-api-access-8rcn4\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.725177 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrs6\" (UniqueName: \"kubernetes.io/projected/57038414-fcca-4a2a-8756-46f97cc57d81-kube-api-access-xbrs6\") pod \"watcher-operator-controller-manager-6c469bc6bb-xlwmb\" (UID: \"57038414-fcca-4a2a-8756-46f97cc57d81\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.732478 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.733530 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.737477 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sjp2t" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.750041 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.780569 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.848633 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gfn\" (UniqueName: \"kubernetes.io/projected/44ec973d-9403-48f4-b92c-72f0bd485b0f-kube-api-access-28gfn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtlpm\" (UID: \"44ec973d-9403-48f4-b92c-72f0bd485b0f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.848739 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.848915 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.848966 4804 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.848949531 +0000 UTC m=+1071.960368868 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.961897 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gfn\" (UniqueName: \"kubernetes.io/projected/44ec973d-9403-48f4-b92c-72f0bd485b0f-kube-api-access-28gfn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtlpm\" (UID: \"44ec973d-9403-48f4-b92c-72f0bd485b0f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:16.988658 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.005193 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gfn\" (UniqueName: \"kubernetes.io/projected/44ec973d-9403-48f4-b92c-72f0bd485b0f-kube-api-access-28gfn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtlpm\" (UID: \"44ec973d-9403-48f4-b92c-72f0bd485b0f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.063180 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.064279 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.063386 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.064465 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:18.064446245 +0000 UTC m=+1072.175865572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.064707 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.064790 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:18.064762695 +0000 UTC m=+1072.176182062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.101149 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.130172 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.176253 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.198610 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"] Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.373249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.373693 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.373762 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:19.373742527 +0000 UTC m=+1073.485161874 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.688461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" event={"ID":"0b746a42-c0b4-4cb9-9352-3623669bad5a","Type":"ContainerStarted","Data":"3e8f7e3b7ab6d784584525125ba04e2b9d6d38c51cfef5895e69d5253b26732a"} Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.944000 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.944623 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.944688 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:19.944671063 +0000 UTC m=+1074.056090400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.068770 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.094567 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.110951 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.129351 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.133626 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.147454 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.147501 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147660 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147789 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:20.147769611 +0000 UTC m=+1074.259188948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147709 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147953 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:20.147937626 +0000 UTC m=+1074.259356963 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.155020 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545c7d25_7774_4c62_89b8_f491fd4065e8.slice/crio-217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a WatchSource:0}: Error finding container 217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a: Status 404 returned error can't find the container with id 217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.177454 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b97973_fa08_4b79_9164_918a4d04f8b7.slice/crio-0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698 WatchSource:0}: Error finding container 0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698: Status 404 returned error can't find the container with id 0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.178847 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2546387a_6a42_4f8d_a321_2f9cbaa11adb.slice/crio-51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b WatchSource:0}: Error finding container 51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b: Status 404 returned error can't find the container with id 51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.621879 4804 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"] Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.622806 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b1ca46_becb_417e_b05e_777d40246cb6.slice/crio-1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80 WatchSource:0}: Error finding container 1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80: Status 404 returned error can't find the container with id 1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.624894 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.636689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"] Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.638372 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5727ae12_4720_4470_b5cc_8b8ae81c2af7.slice/crio-2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1 WatchSource:0}: Error finding container 2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1: Status 404 returned error can't find the container with id 2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.647914 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa66dc5_a518_40dd_a4b5_dd2b34425ad5.slice/crio-6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78 WatchSource:0}: Error finding container 6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78: 
Status 404 returned error can't find the container with id 6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.648314 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.659531 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.663349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.674543 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.680188 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q7bxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-56dc67d744-rbrxl_openstack-operators(067b67c8-64c5-4c21-b1b1-770aa68e0eb7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.681720 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podUID="067b67c8-64c5-4c21-b1b1-770aa68e0eb7" Feb 17 13:43:18 crc 
kubenswrapper[4804]: I0217 13:43:18.687999 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.699269 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" event={"ID":"2546387a-6a42-4f8d-a321-2f9cbaa11adb","Type":"ContainerStarted","Data":"51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.701668 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" event={"ID":"d3332002-6930-418f-8288-e8344be70c6a","Type":"ContainerStarted","Data":"2815fe792194516ef3d3a5f1b6c97356f068723b239cf94254dbb30284d13940"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.703762 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" event={"ID":"07b97973-fa08-4b79-9164-918a4d04f8b7","Type":"ContainerStarted","Data":"0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.717672 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" event={"ID":"545c7d25-7774-4c62-89b8-f491fd4065e8","Type":"ContainerStarted","Data":"217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.721684 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" event={"ID":"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984","Type":"ContainerStarted","Data":"81cdce91c558f3e8224d23b0239cfe80229f03f0d36b9ddf2558e90ffb069bbc"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.723371 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" event={"ID":"97925efc-eb46-4a60-b372-b31f13a2c876","Type":"ContainerStarted","Data":"27c9f34a05463bd2778237b8fd6f4dda2cee45edbef9f7b67c5cccf42c7bbe21"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.725085 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" event={"ID":"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5","Type":"ContainerStarted","Data":"6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.726302 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" event={"ID":"067b67c8-64c5-4c21-b1b1-770aa68e0eb7","Type":"ContainerStarted","Data":"2ab8f06a4e2b6492a89d17a3df09697e6ee8018f5875feccae6f9451e8581f49"} Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.731115 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podUID="067b67c8-64c5-4c21-b1b1-770aa68e0eb7" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.732869 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" event={"ID":"ac1e20c8-4527-4bba-85bd-2154e1244d3e","Type":"ContainerStarted","Data":"2d9c0cbe835a0154cab35f0c29db27a3bdf2eb7f1561992745e2ba9de8a5ee03"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.735470 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" 
event={"ID":"79eb8fb0-6207-44c8-b3c2-a00116bcf10b","Type":"ContainerStarted","Data":"6683a351b12ca73497e3085ad06ce185a775b805f769957533ba381f18bcd2c4"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.736555 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" event={"ID":"36b1ca46-becb-417e-b05e-777d40246cb6","Type":"ContainerStarted","Data":"1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.739319 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" event={"ID":"fbc5e6cd-47c6-4199-a0f2-e4292a836fac","Type":"ContainerStarted","Data":"e15006572fa3cb4d1a14e535ca276b22aad783c02dd25c35ee4843b763bc9e7d"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.750958 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" event={"ID":"5727ae12-4720-4470-b5cc-8b8ae81c2af7","Type":"ContainerStarted","Data":"2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.798039 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9"] Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.820209 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57038414_fcca_4a2a_8756_46f97cc57d81.slice/crio-e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4 WatchSource:0}: Error finding container e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4: Status 404 returned error can't find the container with id e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.824746 4804 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.832910 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x2mgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-nwmk5_openstack-operators(1c7ad838-6225-4001-899a-7f741cb75f2f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.833757 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.834335 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podUID="1c7ad838-6225-4001-899a-7f741cb75f2f" Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.835465 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf94e791f_16fd_4364_a246_35bcca0d14e6.slice/crio-eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605 WatchSource:0}: Error finding container eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605: Status 404 returned error can't find the container with id 
eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.835897 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42505b9c_f878_4feb_b9a1_9dfa11ec0f56.slice/crio-ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090 WatchSource:0}: Error finding container ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090: Status 404 returned error can't find the container with id ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090 Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.840076 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rkwhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-79558bbfbf-n6fl9_openstack-operators(f94e791f-16fd-4364-a246-35bcca0d14e6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.840538 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7xglv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57bd55f9b7-9vbg5_openstack-operators(42505b9c-f878-4feb-b9a1-9dfa11ec0f56): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.841462 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podUID="f94e791f-16fd-4364-a246-35bcca0d14e6" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.841975 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podUID="42505b9c-f878-4feb-b9a1-9dfa11ec0f56" Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.846186 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ec973d_9403_48f4_b92c_72f0bd485b0f.slice/crio-4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542 WatchSource:0}: Error finding container 
4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542: Status 404 returned error can't find the container with id 4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.850925 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod430279ab_ba2f_4838_ab07_b851d4df84a0.slice/crio-83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6 WatchSource:0}: Error finding container 83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6: Status 404 returned error can't find the container with id 83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.852285 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.858215 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} 
{} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28gfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-rtlpm_openstack-operators(44ec973d-9403-48f4-b92c-72f0bd485b0f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.858690 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.859398 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podUID="44ec973d-9403-48f4-b92c-72f0bd485b0f" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.859763 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9cb0b42ba1836ba4320a0a4660bfdeddea8c0685be379c0000dafb16398f4469,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v8xc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-6c78d668d5-pddsh_openstack-operators(430279ab-ba2f-4838-ab07-b851d4df84a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.860973 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podUID="430279ab-ba2f-4838-ab07-b851d4df84a0" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.865709 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"] Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.469371 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.469567 
4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.469682 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:23.469651302 +0000 UTC m=+1077.581070679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.766502 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" event={"ID":"44ec973d-9403-48f4-b92c-72f0bd485b0f","Type":"ContainerStarted","Data":"4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542"} Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.767635 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" event={"ID":"57038414-fcca-4a2a-8756-46f97cc57d81","Type":"ContainerStarted","Data":"e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4"} Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.768881 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podUID="44ec973d-9403-48f4-b92c-72f0bd485b0f" Feb 17 13:43:19 
crc kubenswrapper[4804]: I0217 13:43:19.769178 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" event={"ID":"1c7ad838-6225-4001-899a-7f741cb75f2f","Type":"ContainerStarted","Data":"95a722dfb74ea63952d9a887ace6ed84417ff6c0149e537bbb9344073c2146a8"} Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.770385 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" event={"ID":"42505b9c-f878-4feb-b9a1-9dfa11ec0f56","Type":"ContainerStarted","Data":"ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090"} Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.772453 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podUID="42505b9c-f878-4feb-b9a1-9dfa11ec0f56" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.772489 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podUID="1c7ad838-6225-4001-899a-7f741cb75f2f" Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.772865 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" event={"ID":"430279ab-ba2f-4838-ab07-b851d4df84a0","Type":"ContainerStarted","Data":"83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6"} Feb 17 13:43:19 crc 
kubenswrapper[4804]: E0217 13:43:19.774178 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9cb0b42ba1836ba4320a0a4660bfdeddea8c0685be379c0000dafb16398f4469\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podUID="430279ab-ba2f-4838-ab07-b851d4df84a0" Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.776834 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" event={"ID":"f94e791f-16fd-4364-a246-35bcca0d14e6","Type":"ContainerStarted","Data":"eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605"} Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.779751 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podUID="067b67c8-64c5-4c21-b1b1-770aa68e0eb7" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.779780 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9\\\"\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podUID="f94e791f-16fd-4364-a246-35bcca0d14e6" Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.976092 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.976267 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.976335 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:23.976316962 +0000 UTC m=+1078.087736299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: I0217 13:43:20.178481 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:20 crc kubenswrapper[4804]: I0217 13:43:20.178554 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " 
pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178735 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178779 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178810 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:24.17879087 +0000 UTC m=+1078.290210207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178864 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:24.178843772 +0000 UTC m=+1078.290263109 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.781995 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podUID="1c7ad838-6225-4001-899a-7f741cb75f2f" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783499 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9\\\"\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podUID="f94e791f-16fd-4364-a246-35bcca0d14e6" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783509 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podUID="44ec973d-9403-48f4-b92c-72f0bd485b0f" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783555 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podUID="42505b9c-f878-4feb-b9a1-9dfa11ec0f56" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783558 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9cb0b42ba1836ba4320a0a4660bfdeddea8c0685be379c0000dafb16398f4469\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podUID="430279ab-ba2f-4838-ab07-b851d4df84a0" Feb 17 13:43:23 crc kubenswrapper[4804]: I0217 13:43:23.520956 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:23 crc kubenswrapper[4804]: E0217 13:43:23.521171 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:23 crc kubenswrapper[4804]: E0217 13:43:23.521390 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:31.521372469 +0000 UTC m=+1085.632791806 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: I0217 13:43:24.035463 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.035708 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.035765 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:32.03574517 +0000 UTC m=+1086.147164517 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: I0217 13:43:24.239245 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239402 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: I0217 13:43:24.239426 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239462 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:32.239445597 +0000 UTC m=+1086.350864934 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239528 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239562 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:32.23955047 +0000 UTC m=+1086.350969807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:31 crc kubenswrapper[4804]: I0217 13:43:31.576171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:31 crc kubenswrapper[4804]: E0217 13:43:31.576559 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:31 crc kubenswrapper[4804]: E0217 13:43:31.577443 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert 
podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:47.577420323 +0000 UTC m=+1101.688839660 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: I0217 13:43:32.085458 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.085639 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.085685 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:48.085672393 +0000 UTC m=+1102.197091730 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: I0217 13:43:32.288939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289139 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289486 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:48.289459504 +0000 UTC m=+1102.400878891 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: I0217 13:43:32.289654 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289786 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289842 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:48.289831164 +0000 UTC m=+1102.401250561 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.208581 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89" Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.209095 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5pcfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54967dbbdf-l5cl2_openstack-operators(97925efc-eb46-4a60-b372-b31f13a2c876): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.210289 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" podUID="97925efc-eb46-4a60-b372-b31f13a2c876" Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.483247 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" podUID="97925efc-eb46-4a60-b372-b31f13a2c876" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.185613 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:3cba74378b21d22a9081b69a7547667220f090ae9281b2eabea35f91dfcf56c6" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.186060 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:3cba74378b21d22a9081b69a7547667220f090ae9281b2eabea35f91dfcf56c6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ls4zb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-745bbbd77b-ptrs5_openstack-operators(79eb8fb0-6207-44c8-b3c2-a00116bcf10b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.187346 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" podUID="79eb8fb0-6207-44c8-b3c2-a00116bcf10b" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.511619 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:3cba74378b21d22a9081b69a7547667220f090ae9281b2eabea35f91dfcf56c6\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" podUID="79eb8fb0-6207-44c8-b3c2-a00116bcf10b" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.826676 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.826878 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-stpzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5ddd85db87-c8hmm_openstack-operators(36b1ca46-becb-417e-b05e-777d40246cb6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.828040 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" podUID="36b1ca46-becb-417e-b05e-777d40246cb6" Feb 17 13:43:37 crc kubenswrapper[4804]: E0217 13:43:37.517850 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" podUID="36b1ca46-becb-417e-b05e-777d40246cb6" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.568007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" event={"ID":"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5","Type":"ContainerStarted","Data":"5c89006876b8b1e9d3045d1c79368457d87477adb1bc57a36422c5e8a712da20"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.568653 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.570190 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" event={"ID":"1c7ad838-6225-4001-899a-7f741cb75f2f","Type":"ContainerStarted","Data":"3db8948ca335309c6c47d66c1a2e2d6297e85453fdbb017e59ccad0719040c41"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.570428 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.572276 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" event={"ID":"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984","Type":"ContainerStarted","Data":"54adeb351a52486e17abe0f9b50cf8c806b3e1c6423d9918147e06e66c6219e0"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.572444 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.574053 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" event={"ID":"44ec973d-9403-48f4-b92c-72f0bd485b0f","Type":"ContainerStarted","Data":"76d1816c2540e7c49284c92d90af3e09d0423c04240e37c1b1e42048036c6ca7"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.575833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" event={"ID":"d3332002-6930-418f-8288-e8344be70c6a","Type":"ContainerStarted","Data":"7ab13d28c53db3b199a756c0355824c891323d454fc2a7fdf2a512223a96a156"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.575978 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.578007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" event={"ID":"07b97973-fa08-4b79-9164-918a4d04f8b7","Type":"ContainerStarted","Data":"3585c2a3514981cbd4f883a63b8bc41064406ea99f4aa4056357d4066270222b"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.578289 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.583474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" event={"ID":"5727ae12-4720-4470-b5cc-8b8ae81c2af7","Type":"ContainerStarted","Data":"74a6fd861a620cc6fb415cb407d6c1b308efc0162edb78887a75a978ed5fe3df"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.583635 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 
13:43:43.587780 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" event={"ID":"ac1e20c8-4527-4bba-85bd-2154e1244d3e","Type":"ContainerStarted","Data":"615769a1a7ba0a6e6a4183426261541178b669c0e5417273476ef73018dc5a3c"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.588609 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.600010 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" event={"ID":"430279ab-ba2f-4838-ab07-b851d4df84a0","Type":"ContainerStarted","Data":"1ac0307065a2b27a33baeb0e24a8999550a23391cd0ea6bfe01459914f545ad0"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.600367 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.607536 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" event={"ID":"f94e791f-16fd-4364-a246-35bcca0d14e6","Type":"ContainerStarted","Data":"1b75b7dae645d7cac6f38834036ab3236222319a525f659f325cad59a4c98e22"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.608123 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.610559 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" event={"ID":"2546387a-6a42-4f8d-a321-2f9cbaa11adb","Type":"ContainerStarted","Data":"9e4be95bc3db4500b2c4afa726bc78655b58ec83c050ed3fb8500a036379b72f"} Feb 17 13:43:43 crc 
kubenswrapper[4804]: I0217 13:43:43.610715 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.612305 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" event={"ID":"fbc5e6cd-47c6-4199-a0f2-e4292a836fac","Type":"ContainerStarted","Data":"fa84bc9e07f4c4c884d0faf47857f1d82f0e0961c9ea342ff85a45e1df093286"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.612391 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.614008 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" event={"ID":"57038414-fcca-4a2a-8756-46f97cc57d81","Type":"ContainerStarted","Data":"8f812716c35bb910bae6c3fc982b3867441f782036702db3c2c5967423b131f6"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.614055 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.614618 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" podStartSLOduration=7.969450584 podStartE2EDuration="28.614599262s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.657353013 +0000 UTC m=+1072.768772350" lastFinishedPulling="2026-02-17 13:43:39.302501691 +0000 UTC m=+1093.413921028" observedRunningTime="2026-02-17 13:43:43.609598817 +0000 UTC m=+1097.721018164" watchObservedRunningTime="2026-02-17 13:43:43.614599262 +0000 UTC m=+1097.726018599" Feb 17 13:43:43 crc 
kubenswrapper[4804]: I0217 13:43:43.621119 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" event={"ID":"067b67c8-64c5-4c21-b1b1-770aa68e0eb7","Type":"ContainerStarted","Data":"4ad19de3ea4a63c8f85ec290f0edc9ffd4b5584dd043a4158cb9146bf539364d"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.621366 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.625480 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" event={"ID":"545c7d25-7774-4c62-89b8-f491fd4065e8","Type":"ContainerStarted","Data":"54cd6a2d300e8d178521467d3e580af1070b9d46523c6c62e41113e2789e0f9e"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.625736 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.627434 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" event={"ID":"0b746a42-c0b4-4cb9-9352-3623669bad5a","Type":"ContainerStarted","Data":"8df296497c51966ca58ffd39a4d292c89c9e09aa6f855a8ac799b7b96bf36135"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.627589 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.640317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" 
event={"ID":"42505b9c-f878-4feb-b9a1-9dfa11ec0f56","Type":"ContainerStarted","Data":"988834749e5e8d25edd1dd777c56e64197708c6cdaa8c5720b0d7591ba0b80f2"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.641542 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.651029 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" podStartSLOduration=7.087970768 podStartE2EDuration="28.651011729s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.669596185 +0000 UTC m=+1072.781015522" lastFinishedPulling="2026-02-17 13:43:40.232637146 +0000 UTC m=+1094.344056483" observedRunningTime="2026-02-17 13:43:43.648463989 +0000 UTC m=+1097.759883356" watchObservedRunningTime="2026-02-17 13:43:43.651011729 +0000 UTC m=+1097.762431066" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.671545 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" podStartSLOduration=8.011535328 podStartE2EDuration="28.671524599s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.641582731 +0000 UTC m=+1072.753002068" lastFinishedPulling="2026-02-17 13:43:39.301571972 +0000 UTC m=+1093.412991339" observedRunningTime="2026-02-17 13:43:43.66960749 +0000 UTC m=+1097.781026827" watchObservedRunningTime="2026-02-17 13:43:43.671524599 +0000 UTC m=+1097.782943936" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.695711 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podStartSLOduration=4.076119066 podStartE2EDuration="27.695691573s" podCreationTimestamp="2026-02-17 13:43:16 
+0000 UTC" firstStartedPulling="2026-02-17 13:43:18.832390295 +0000 UTC m=+1072.943809632" lastFinishedPulling="2026-02-17 13:43:42.451962802 +0000 UTC m=+1096.563382139" observedRunningTime="2026-02-17 13:43:43.693389411 +0000 UTC m=+1097.804808748" watchObservedRunningTime="2026-02-17 13:43:43.695691573 +0000 UTC m=+1097.807110910" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.760760 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" podStartSLOduration=7.628074881 podStartE2EDuration="28.760741933s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.168844688 +0000 UTC m=+1072.280264025" lastFinishedPulling="2026-02-17 13:43:39.3015117 +0000 UTC m=+1093.412931077" observedRunningTime="2026-02-17 13:43:43.759414812 +0000 UTC m=+1097.870834149" watchObservedRunningTime="2026-02-17 13:43:43.760741933 +0000 UTC m=+1097.872161270" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.800480 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" podStartSLOduration=13.537631135 podStartE2EDuration="28.800458743s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.2140819 +0000 UTC m=+1072.325501237" lastFinishedPulling="2026-02-17 13:43:33.476909508 +0000 UTC m=+1087.588328845" observedRunningTime="2026-02-17 13:43:43.795930601 +0000 UTC m=+1097.907349938" watchObservedRunningTime="2026-02-17 13:43:43.800458743 +0000 UTC m=+1097.911878080" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.828672 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" podStartSLOduration=8.196549562 podStartE2EDuration="28.828656133s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.669527523 +0000 UTC m=+1072.780946860" lastFinishedPulling="2026-02-17 13:43:39.301634044 +0000 UTC m=+1093.413053431" observedRunningTime="2026-02-17 13:43:43.823173352 +0000 UTC m=+1097.934592689" watchObservedRunningTime="2026-02-17 13:43:43.828656133 +0000 UTC m=+1097.940075460" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.859806 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podStartSLOduration=5.297960109 podStartE2EDuration="28.859789784s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.839877019 +0000 UTC m=+1072.951296356" lastFinishedPulling="2026-02-17 13:43:42.401706684 +0000 UTC m=+1096.513126031" observedRunningTime="2026-02-17 13:43:43.852463685 +0000 UTC m=+1097.963883022" watchObservedRunningTime="2026-02-17 13:43:43.859789784 +0000 UTC m=+1097.971209121" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.915079 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podStartSLOduration=5.445874915 podStartE2EDuration="28.915063689s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.859258964 +0000 UTC m=+1072.970678301" lastFinishedPulling="2026-02-17 13:43:42.328447738 +0000 UTC m=+1096.439867075" observedRunningTime="2026-02-17 13:43:43.911806177 +0000 UTC m=+1098.023225514" watchObservedRunningTime="2026-02-17 13:43:43.915063689 +0000 UTC m=+1098.026483026" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.935184 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podStartSLOduration=4.35692392 podStartE2EDuration="27.935165247s" podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.858024345 +0000 UTC m=+1072.969443682" lastFinishedPulling="2026-02-17 13:43:42.436265672 +0000 UTC m=+1096.547685009" observedRunningTime="2026-02-17 13:43:43.933126812 +0000 UTC m=+1098.044546149" watchObservedRunningTime="2026-02-17 13:43:43.935165247 +0000 UTC m=+1098.046584584" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.969851 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podStartSLOduration=4.260325445 podStartE2EDuration="27.969824167s" podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.67999565 +0000 UTC m=+1072.791414987" lastFinishedPulling="2026-02-17 13:43:42.389494372 +0000 UTC m=+1096.500913709" observedRunningTime="2026-02-17 13:43:43.966465763 +0000 UTC m=+1098.077885100" watchObservedRunningTime="2026-02-17 13:43:43.969824167 +0000 UTC m=+1098.081243504" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.985483 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" podStartSLOduration=15.776543852 podStartE2EDuration="28.985463876s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:17.268968358 +0000 UTC m=+1071.380387695" lastFinishedPulling="2026-02-17 13:43:30.477888382 +0000 UTC m=+1084.589307719" observedRunningTime="2026-02-17 13:43:43.983883476 +0000 UTC m=+1098.095302823" watchObservedRunningTime="2026-02-17 13:43:43.985463876 +0000 UTC m=+1098.096883213" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.013524 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" podStartSLOduration=6.944299515 podStartE2EDuration="29.013501031s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.164295037 +0000 UTC m=+1072.275714374" lastFinishedPulling="2026-02-17 13:43:40.233496553 +0000 UTC m=+1094.344915890" observedRunningTime="2026-02-17 13:43:44.010537778 +0000 UTC m=+1098.121957115" watchObservedRunningTime="2026-02-17 13:43:44.013501031 +0000 UTC m=+1098.124920368" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.051100 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podStartSLOduration=5.490696534 podStartE2EDuration="29.051076723s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.840031464 +0000 UTC m=+1072.951450801" lastFinishedPulling="2026-02-17 13:43:42.400411633 +0000 UTC m=+1096.511830990" observedRunningTime="2026-02-17 13:43:44.047179062 +0000 UTC m=+1098.158598399" watchObservedRunningTime="2026-02-17 13:43:44.051076723 +0000 UTC m=+1098.162496050" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.098147 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" podStartSLOduration=7.620193813 podStartE2EDuration="28.098125222s" podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.823533479 +0000 UTC m=+1072.934952816" lastFinishedPulling="2026-02-17 13:43:39.301464868 +0000 UTC m=+1093.412884225" observedRunningTime="2026-02-17 13:43:44.076818746 +0000 UTC m=+1098.188238083" watchObservedRunningTime="2026-02-17 13:43:44.098125222 +0000 UTC m=+1098.209544559" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.140350 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" podStartSLOduration=8.064613374 podStartE2EDuration="29.140332818s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.225779215 +0000 UTC m=+1072.337198552" lastFinishedPulling="2026-02-17 13:43:39.301498659 +0000 UTC m=+1093.412917996" observedRunningTime="2026-02-17 13:43:44.137074127 +0000 UTC m=+1098.248493464" watchObservedRunningTime="2026-02-17 13:43:44.140332818 +0000 UTC m=+1098.251752155" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.590806 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" podStartSLOduration=17.354084359 podStartE2EDuration="32.590786442s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.240796164 +0000 UTC m=+1072.352215501" lastFinishedPulling="2026-02-17 13:43:33.477498247 +0000 UTC m=+1087.588917584" observedRunningTime="2026-02-17 13:43:44.191593448 +0000 UTC m=+1098.303012785" watchObservedRunningTime="2026-02-17 13:43:47.590786442 +0000 UTC m=+1101.702205789" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.675610 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.682907 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.894532 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5zgr8" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.903247 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.167043 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"] Feb 17 13:43:48 crc kubenswrapper[4804]: W0217 13:43:48.174535 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf13099a_fbab_41bf_b30c_5c6b1049af19.slice/crio-d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8 WatchSource:0}: Error finding container d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8: Status 404 returned error can't find the container with id d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8 Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.182011 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.190259 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.387044 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.387119 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.393186 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.393192 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.456992 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pxc28" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.465705 4804 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.668354 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vp69j" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.672798 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.693557 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" event={"ID":"bf13099a-fbab-41bf-b30c-5c6b1049af19","Type":"ContainerStarted","Data":"d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8"} Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.706723 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" event={"ID":"79eb8fb0-6207-44c8-b3c2-a00116bcf10b","Type":"ContainerStarted","Data":"ba6bbf402b813b513db0f5658ae5782c38827546c0e2fdbcbc21df2da06cdb40"} Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.707114 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.708323 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" event={"ID":"97925efc-eb46-4a60-b372-b31f13a2c876","Type":"ContainerStarted","Data":"bdf336cc2fa5990df3f5147256d51e7daef34e921983c80e8bfca1deb02eeaf0"} Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.708687 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.794828 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" podStartSLOduration=4.331396857 podStartE2EDuration="33.794811755s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.648080173 +0000 UTC m=+1072.759499520" lastFinishedPulling="2026-02-17 13:43:48.111495071 +0000 UTC m=+1102.222914418" observedRunningTime="2026-02-17 13:43:48.771311451 +0000 UTC m=+1102.882730788" watchObservedRunningTime="2026-02-17 13:43:48.794811755 +0000 UTC m=+1102.906231092" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.800705 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" podStartSLOduration=4.352108483 podStartE2EDuration="33.800687038s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.665123375 +0000 UTC m=+1072.776542712" lastFinishedPulling="2026-02-17 13:43:48.11370193 +0000 UTC m=+1102.225121267" observedRunningTime="2026-02-17 13:43:48.792686089 +0000 UTC m=+1102.904105416" watchObservedRunningTime="2026-02-17 13:43:48.800687038 +0000 UTC m=+1102.912106375" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.933960 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"] Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.376964 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv"] Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.720752 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" event={"ID":"8155784a-3945-4ca3-aa9a-b0e089ffac52","Type":"ContainerStarted","Data":"394848d12a2dc6457a84641291c2f16ae2b72738a3eddca98e7396c1d76144b7"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.720792 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" event={"ID":"8155784a-3945-4ca3-aa9a-b0e089ffac52","Type":"ContainerStarted","Data":"37d16acca3dce20bf640eb09ad4a1d3b7c7f55ae3357d6c4c27ffc68339708a5"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.720881 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.722282 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" event={"ID":"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8","Type":"ContainerStarted","Data":"8a63ef5863bc879a6e07d6b2c149cb1b992e69f989c426a9b5decb1ba81836be"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.724056 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" event={"ID":"36b1ca46-becb-417e-b05e-777d40246cb6","Type":"ContainerStarted","Data":"ea6df02621ab67ca7f43e12e1b81f1bc560357f71096a1d2999dac2cb8ad5000"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.724424 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.749361 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" podStartSLOduration=33.749346142 podStartE2EDuration="33.749346142s" 
podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:43:49.745706958 +0000 UTC m=+1103.857126295" watchObservedRunningTime="2026-02-17 13:43:49.749346142 +0000 UTC m=+1103.860765479" Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.769309 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" podStartSLOduration=4.375647137 podStartE2EDuration="34.769288164s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.648049942 +0000 UTC m=+1072.759469279" lastFinishedPulling="2026-02-17 13:43:49.041690969 +0000 UTC m=+1103.153110306" observedRunningTime="2026-02-17 13:43:49.765850827 +0000 UTC m=+1103.877270154" watchObservedRunningTime="2026-02-17 13:43:49.769288164 +0000 UTC m=+1103.880707501" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.796967 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.810339 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.888449 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.907046 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.923850 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.964542 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.974889 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.127553 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.224961 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.242226 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.284989 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.319615 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.355765 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.561305 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.679437 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.782841 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.995857 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:57 crc kubenswrapper[4804]: I0217 13:43:57.103977 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:57 crc kubenswrapper[4804]: I0217 13:43:57.132970 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:58 crc kubenswrapper[4804]: I0217 13:43:58.682365 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.234997 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.235708 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/op
enstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:qua
y.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_META
DATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value
:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-an
telope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_
IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_T
EST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fl5dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88_openstack-operators(ae7598b8-fff5-4044-bbd7-0c8f2f60eed8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.238621 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" podUID="ae7598b8-fff5-4044-bbd7-0c8f2f60eed8" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.750400 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:fc76cfd501345b5e18ddf48006aa04bcb4cb4020acd83894ed7c4fc952c0232a" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.750582 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:fc76cfd501345b5e18ddf48006aa04bcb4cb4020acd83894ed7c4fc952c0232a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8dcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-66d6b5f488-lrjgg_openstack-operators(bf13099a-fbab-41bf-b30c-5c6b1049af19): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.751977 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" podUID="bf13099a-fbab-41bf-b30c-5c6b1049af19" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.823033 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:fc76cfd501345b5e18ddf48006aa04bcb4cb4020acd83894ed7c4fc952c0232a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" podUID="bf13099a-fbab-41bf-b30c-5c6b1049af19" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.824536 4804 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" podUID="ae7598b8-fff5-4044-bbd7-0c8f2f60eed8" Feb 17 13:44:14 crc kubenswrapper[4804]: I0217 13:44:14.920395 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" event={"ID":"bf13099a-fbab-41bf-b30c-5c6b1049af19","Type":"ContainerStarted","Data":"146daf9882dc74acee691f78070232ae2e28634122daa5632d2d466b0cac1b7e"} Feb 17 13:44:14 crc kubenswrapper[4804]: I0217 13:44:14.921356 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:44:14 crc kubenswrapper[4804]: I0217 13:44:14.946564 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" podStartSLOduration=34.03729189 podStartE2EDuration="59.946533787s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:48.176680316 +0000 UTC m=+1102.288099653" lastFinishedPulling="2026-02-17 13:44:14.085922213 +0000 UTC m=+1128.197341550" observedRunningTime="2026-02-17 13:44:14.940603071 +0000 UTC m=+1129.052022418" watchObservedRunningTime="2026-02-17 13:44:14.946533787 +0000 UTC m=+1129.057953174" Feb 17 13:44:16 crc kubenswrapper[4804]: I0217 13:44:16.943833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" event={"ID":"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8","Type":"ContainerStarted","Data":"02b984c9776b998e6e477d37b7d27a0322000916d31ba350356331a3fc9f3763"} Feb 17 13:44:16 crc kubenswrapper[4804]: I0217 
13:44:16.944365 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:44:16 crc kubenswrapper[4804]: I0217 13:44:16.975764 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" podStartSLOduration=34.927563886 podStartE2EDuration="1m1.975739757s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:48.953403013 +0000 UTC m=+1103.064822350" lastFinishedPulling="2026-02-17 13:44:16.001578884 +0000 UTC m=+1130.112998221" observedRunningTime="2026-02-17 13:44:16.969526862 +0000 UTC m=+1131.080946239" watchObservedRunningTime="2026-02-17 13:44:16.975739757 +0000 UTC m=+1131.087159114" Feb 17 13:44:27 crc kubenswrapper[4804]: I0217 13:44:27.909690 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:44:28 crc kubenswrapper[4804]: I0217 13:44:28.472154 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.931172 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.934008 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.940628 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.940930 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-sgq2d" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.941392 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.943115 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.946673 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.018075 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.019218 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.022395 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.029751 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.068322 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.068376 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169612 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169709 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169754 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169804 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.170875 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.195484 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc 
kubenswrapper[4804]: I0217 13:44:47.251892 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.270607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.270697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.270735 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.271742 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.271867 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.290724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.337953 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.787003 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.791160 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.865127 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:47 crc kubenswrapper[4804]: W0217 13:44:47.872080 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87f6a03c_039e_4107_985b_803f59ccfb89.slice/crio-94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad WatchSource:0}: Error finding container 94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad: Status 404 returned error can't find the container with id 94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad Feb 17 13:44:48 crc kubenswrapper[4804]: I0217 13:44:48.183161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" event={"ID":"13452752-6880-43b4-9a63-8768d0afa122","Type":"ContainerStarted","Data":"0a2ffab0e99d6480ecf94c911214dd4efd07af4dfd4133a32e88b8a9e531736b"} Feb 17 13:44:48 crc 
kubenswrapper[4804]: I0217 13:44:48.184659 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" event={"ID":"87f6a03c-039e-4107-985b-803f59ccfb89","Type":"ContainerStarted","Data":"94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad"} Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.673404 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.707036 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"] Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.708117 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.752975 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"] Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.819951 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.820001 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.820023 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.921116 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.921162 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.921183 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.922057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.922221 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: 
\"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.965720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.038587 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.040770 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.079863 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"] Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.083397 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.099608 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"]
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.228952 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.229011 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.229251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.330115 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.330544 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.330648 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.331488 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.331985 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.374155 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.479136 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.500707 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"]
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.864678 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.866098 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869177 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869254 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869460 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869608 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869911 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cxlcf"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.870070 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.870310 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.889978 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042085 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042146 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042185 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042236 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042273 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042313 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042343 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042380 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042458 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.121262 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"]
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.143165 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.143930 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144298 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144495 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145400 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145518 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145626 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145852 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.146776 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.147033 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.147215 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.150882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.150937 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.151354 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.151984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.153128 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.154589 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.156851 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.191103 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.222351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.227500 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" event={"ID":"3586301a-dce2-427b-b5c4-9376e59fbf27","Type":"ContainerStarted","Data":"b6acb0860f5dd58b1333ac392aa371b170675172cb3eb7dbaaabc60cbdae0d1e"}
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.229074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" event={"ID":"8175f453-b68b-4236-844d-ff723515fe63","Type":"ContainerStarted","Data":"6dec93dab248c776ff8091a9233f8da9e53443d47dfd060ebb89371b1dc81611"}
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.268689 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.282103 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.286858 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.289787 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.290312 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.290939 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291052 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291148 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291243 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291458 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-m99n4"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352747 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352843 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352908 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352938 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352969 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353007 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353033 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353069 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353223 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353405 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455351 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455693 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455777 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455813 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455853 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455895 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455924 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455964 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455987 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.456015 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.456401 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.456916 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457008 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457027 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457359 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457696 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.458341 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.468942 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.469414 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.470400 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.472141 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.478555 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.485457 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.497429 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.624791 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.010461 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.198159 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.202217 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.203869 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.207545 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.208584 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gf9w9"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.210038 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.214219 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.215779 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270238 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-default\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270352 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-generated\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-kolla-config\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270393 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-operator-scripts\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270417 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbzzm\" (UniqueName: \"kubernetes.io/projected/49b02c8f-ff07-48f9-8012-e78dc6591499-kube-api-access-fbzzm\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270435 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270466 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.274125 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerStarted","Data":"1805a02bed1d8e8fe42a7072ff53aa627c043f3fc1570707e67a0dbc0d5ed7c3"}
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374195 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-default\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374305 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-generated\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374336 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-kolla-config\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0"
Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374354 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-operator-scripts\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " 
pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbzzm\" (UniqueName: \"kubernetes.io/projected/49b02c8f-ff07-48f9-8012-e78dc6591499-kube-api-access-fbzzm\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374449 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374478 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.375145 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.376951 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-operator-scripts\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.378383 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-default\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.379016 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-kolla-config\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.382241 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-generated\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.397542 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.412998 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.424183 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.424303 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbzzm\" (UniqueName: \"kubernetes.io/projected/49b02c8f-ff07-48f9-8012-e78dc6591499-kube-api-access-fbzzm\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.466044 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: W0217 13:44:52.501638 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc485c5b_1bf7_473f_b5b0_a55d5dd0e2ad.slice/crio-3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee WatchSource:0}: Error finding container 3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee: Status 404 returned error can't find the container with id 3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.531928 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.207552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.291397 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerStarted","Data":"3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee"} Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.701474 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.702583 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.705056 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9cslz" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.705414 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.705539 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.716538 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.722116 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.760600 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.761953 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.765959 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8zpnm" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.766158 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.766994 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.772062 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821464 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821521 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tpq\" (UniqueName: \"kubernetes.io/projected/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kube-api-access-s4tpq\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821563 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821644 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821708 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tbp\" (UniqueName: \"kubernetes.io/projected/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kube-api-access-r7tbp\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821741 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821777 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821830 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-config-data\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kolla-config\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821879 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821922 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821941 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.924554 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.924690 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-config-data\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.924722 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kolla-config\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925185 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925294 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925356 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925601 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.926101 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-config-data\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.926504 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.927178 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.928687 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kolla-config\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.931387 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.931506 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4tpq\" (UniqueName: \"kubernetes.io/projected/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kube-api-access-s4tpq\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932511 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932548 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932587 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tbp\" 
(UniqueName: \"kubernetes.io/projected/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kube-api-access-r7tbp\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932622 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.934989 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.940105 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.943777 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.944770 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.949523 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.960477 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.964096 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4tpq\" (UniqueName: \"kubernetes.io/projected/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kube-api-access-s4tpq\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.967743 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tbp\" (UniqueName: \"kubernetes.io/projected/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kube-api-access-r7tbp\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.992785 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:54 crc kubenswrapper[4804]: I0217 13:44:54.034129 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:54 crc kubenswrapper[4804]: I0217 13:44:54.080544 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.101579 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.103081 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.113348 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-q5rh2" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.119023 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.178173 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"kube-state-metrics-0\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.279642 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"kube-state-metrics-0\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.303912 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwbr\" (UniqueName: 
\"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"kube-state-metrics-0\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.429064 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.028273 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rzcfd"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.029846 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.032732 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-86dqn" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.033480 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.033836 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.048786 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.093358 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-p4wrm"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.094968 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.121643 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p4wrm"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140518 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-run\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140570 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xswhb\" (UniqueName: \"kubernetes.io/projected/9c049787-03d2-4679-8705-ec2cd1ad8141-kube-api-access-xswhb\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140634 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-etc-ovs\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140659 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-ovn-controller-tls-certs\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140681 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fvqlc\" (UniqueName: \"kubernetes.io/projected/45330d20-989c-4507-ae57-5beaee075484-kube-api-access-fvqlc\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140719 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-lib\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140749 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45330d20-989c-4507-ae57-5beaee075484-scripts\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140781 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140815 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c049787-03d2-4679-8705-ec2cd1ad8141-scripts\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140838 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140863 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-log\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140909 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-log-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140933 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-combined-ca-bundle\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242276 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-lib\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242340 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45330d20-989c-4507-ae57-5beaee075484-scripts\") pod 
\"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242380 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242417 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c049787-03d2-4679-8705-ec2cd1ad8141-scripts\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242446 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242469 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-log\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-log-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 
13:44:59.242548 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-combined-ca-bundle\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242588 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-run\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242616 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xswhb\" (UniqueName: \"kubernetes.io/projected/9c049787-03d2-4679-8705-ec2cd1ad8141-kube-api-access-xswhb\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242667 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-etc-ovs\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-ovn-controller-tls-certs\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242713 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fvqlc\" (UniqueName: \"kubernetes.io/projected/45330d20-989c-4507-ae57-5beaee075484-kube-api-access-fvqlc\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.243946 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-lib\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245737 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245738 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-log-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245769 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-run\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245835 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-log\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " 
pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245894 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-etc-ovs\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.246040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.246574 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45330d20-989c-4507-ae57-5beaee075484-scripts\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.247750 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c049787-03d2-4679-8705-ec2cd1ad8141-scripts\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.251161 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-combined-ca-bundle\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.262980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-ovn-controller-tls-certs\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.264057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvqlc\" (UniqueName: \"kubernetes.io/projected/45330d20-989c-4507-ae57-5beaee075484-kube-api-access-fvqlc\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.265958 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xswhb\" (UniqueName: \"kubernetes.io/projected/9c049787-03d2-4679-8705-ec2cd1ad8141-kube-api-access-xswhb\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.364530 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.416080 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.579622 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.581244 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.583654 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gldrt" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.584328 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.584587 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.584651 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.585823 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.600058 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648555 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhhrb\" (UniqueName: \"kubernetes.io/projected/0fc5c8da-b323-4afb-aa47-125fc63caefd-kube-api-access-rhhrb\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648908 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648928 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648971 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648997 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.649090 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.649137 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750488 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750545 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750577 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750614 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhhrb\" (UniqueName: \"kubernetes.io/projected/0fc5c8da-b323-4afb-aa47-125fc63caefd-kube-api-access-rhhrb\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750630 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 
13:44:59.750644 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750677 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750696 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.751751 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.751782 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.751908 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.752727 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.755925 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.756571 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.761118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.769433 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc 
kubenswrapper[4804]: I0217 13:44:59.771453 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhhrb\" (UniqueName: \"kubernetes.io/projected/0fc5c8da-b323-4afb-aa47-125fc63caefd-kube-api-access-rhhrb\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.896811 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.156936 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"] Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.157903 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.161357 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.165520 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.175665 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"] Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.262656 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc 
kubenswrapper[4804]: I0217 13:45:00.262717 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.262748 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.364372 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.364447 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.364477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod 
\"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.365949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.367822 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.404318 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: W0217 13:45:00.432512 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49b02c8f_ff07_48f9_8012_e78dc6591499.slice/crio-7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145 WatchSource:0}: Error finding container 7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145: Status 404 returned error can't find the container with id 7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145 Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 
13:45:00.482304 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"
Feb 17 13:45:01 crc kubenswrapper[4804]: I0217 13:45:01.358851 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerStarted","Data":"7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145"}
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.613000 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.616368 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.623976 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.624277 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mnfdx"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.624505 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.624695 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.646683 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.727811 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.727874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-config\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.727999 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728048 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728075 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728095 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vk87\" (UniqueName: \"kubernetes.io/projected/10e1124a-f402-422d-a906-8d22c90d4abe-kube-api-access-4vk87\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728169 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728230 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829417 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829586 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vk87\" (UniqueName: \"kubernetes.io/projected/10e1124a-f402-422d-a906-8d22c90d4abe-kube-api-access-4vk87\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829649 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829678 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829731 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829773 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-config\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.830853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-config\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.831251 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.832474 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.833501 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.841315 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.843996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.844051 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.855447 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.855596 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vk87\" (UniqueName: \"kubernetes.io/projected/10e1124a-f402-422d-a906-8d22c90d4abe-kube-api-access-4vk87\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.952829 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:07 crc kubenswrapper[4804]: E0217 13:45:07.544622 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Feb 17 13:45:07 crc kubenswrapper[4804]: E0217 13:45:07.545445 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrqs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(7705a06d-bc27-4686-9ca4-4aae248ead07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 13:45:07 crc kubenswrapper[4804]: E0217 13:45:07.546708 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07"
Feb 17 13:45:08 crc kubenswrapper[4804]: E0217 13:45:08.415299 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07"
Feb 17 13:45:11 crc kubenswrapper[4804]: I0217 13:45:11.869701 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"]
Feb 17 13:45:13 crc kubenswrapper[4804]: W0217 13:45:13.862963 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77ee5ee_2b38_4a70_bc28_e2cdf625ab1f.slice/crio-004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db WatchSource:0}: Error finding container 004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db: Status 404 returned error can't find the container with id 004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.918369 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.918672 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gs5bs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-rftlc_openstack(13452752-6880-43b4-9a63-8768d0afa122): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.919881 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" podUID="13452752-6880-43b4-9a63-8768d0afa122"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.953427 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.954044 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zx6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-kqvs6_openstack(3586301a-dce2-427b-b5c4-9376e59fbf27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.955759 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" podUID="3586301a-dce2-427b-b5c4-9376e59fbf27"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.973576 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.973821 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxjdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-c8wp7_openstack(87f6a03c-039e-4107-985b-803f59ccfb89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.975348 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" podUID="87f6a03c-039e-4107-985b-803f59ccfb89"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.987023 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.987241 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kv9pl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-vrzhp_openstack(8175f453-b68b-4236-844d-ff723515fe63): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.988481 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" podUID="8175f453-b68b-4236-844d-ff723515fe63"
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.341693 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 17 13:45:14 crc kubenswrapper[4804]: W0217 13:45:14.347884 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5ef96d0_19a6_4561_bde2_cf38e0280b39.slice/crio-819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c WatchSource:0}: Error finding container 819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c: Status 404 returned error can't find the container with id 819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.459680 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5ef96d0-19a6-4561-bde2-cf38e0280b39","Type":"ContainerStarted","Data":"819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c"}
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.462316 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerStarted","Data":"96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568"}
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.462357 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerStarted","Data":"004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db"}
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.465991 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerStarted","Data":"b93a15f86d51cc28e40802669fc1dc0ee030c02a56e4690a974969a6a5e38c99"}
Feb 17 13:45:14 crc kubenswrapper[4804]: E0217 13:45:14.468227 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" podUID="3586301a-dce2-427b-b5c4-9376e59fbf27"
Feb 17 13:45:14 crc kubenswrapper[4804]: E0217 13:45:14.468449 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" podUID="8175f453-b68b-4236-844d-ff723515fe63"
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.549545 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd"]
Feb 17 13:45:14 crc kubenswrapper[4804]: W0217 13:45:14.554996 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c049787_03d2_4679_8705_ec2cd1ad8141.slice/crio-6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a WatchSource:0}: Error finding container 6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a: Status 404 returned error can't find the container with id 6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.573375 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.887550 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 17 13:45:14 crc kubenswrapper[4804]: W0217 13:45:14.897814 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9eb8e8f_8bd1_4f69_84ee_27213046c709.slice/crio-918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78 WatchSource:0}: Error finding container 918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78: Status 404 returned error can't find the container with id 918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.946120 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7"
Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.953691 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc"
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.024060 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 17 13:45:15 crc kubenswrapper[4804]: W0217 13:45:15.030103 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e1124a_f402_422d_a906_8d22c90d4abe.slice/crio-ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8 WatchSource:0}: Error finding container ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8: Status 404 returned error can't find the container with id ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063618 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"87f6a03c-039e-4107-985b-803f59ccfb89\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") "
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063674 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"87f6a03c-039e-4107-985b-803f59ccfb89\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") "
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063716 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"13452752-6880-43b4-9a63-8768d0afa122\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") "
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063800 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"13452752-6880-43b4-9a63-8768d0afa122\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") "
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063836 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"87f6a03c-039e-4107-985b-803f59ccfb89\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") "
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.064313 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87f6a03c-039e-4107-985b-803f59ccfb89" (UID: "87f6a03c-039e-4107-985b-803f59ccfb89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.064358 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config" (OuterVolumeSpecName: "config") pod "87f6a03c-039e-4107-985b-803f59ccfb89" (UID: "87f6a03c-039e-4107-985b-803f59ccfb89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.064374 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config" (OuterVolumeSpecName: "config") pod "13452752-6880-43b4-9a63-8768d0afa122" (UID: "13452752-6880-43b4-9a63-8768d0afa122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.067656 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq" (OuterVolumeSpecName: "kube-api-access-lxjdq") pod "87f6a03c-039e-4107-985b-803f59ccfb89" (UID: "87f6a03c-039e-4107-985b-803f59ccfb89"). InnerVolumeSpecName "kube-api-access-lxjdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.067865 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs" (OuterVolumeSpecName: "kube-api-access-gs5bs") pod "13452752-6880-43b4-9a63-8768d0afa122" (UID: "13452752-6880-43b4-9a63-8768d0afa122"). InnerVolumeSpecName "kube-api-access-gs5bs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165907 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165955 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165971 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165983 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165996 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.475474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerDied","Data":"96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568"}
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.475299 4804 generic.go:334] "Generic (PLEG): container finished" podID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerID="96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568" exitCode=0
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.480688 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc"
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.480733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" event={"ID":"13452752-6880-43b4-9a63-8768d0afa122","Type":"ContainerDied","Data":"0a2ffab0e99d6480ecf94c911214dd4efd07af4dfd4133a32e88b8a9e531736b"}
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.483661 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerStarted","Data":"6af0a26e9132d4c61e6cb494719994825c6ff8368e85c8ef8c51fa4c2767ffd0"}
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.488282 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7"
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.488293 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" event={"ID":"87f6a03c-039e-4107-985b-803f59ccfb89","Type":"ContainerDied","Data":"94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad"}
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.491245 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerStarted","Data":"4dad789ec862994f5efea14d5772a174e7195a22623c58ea7121822318679542"}
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.491287 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerStarted","Data":"918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78"}
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.494829 4804 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"10e1124a-f402-422d-a906-8d22c90d4abe","Type":"ContainerStarted","Data":"ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.501772 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerStarted","Data":"de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.504671 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd" event={"ID":"9c049787-03d2-4679-8705-ec2cd1ad8141","Type":"ContainerStarted","Data":"6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.569298 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.579165 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.696321 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.795559 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.924847 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.095398 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p4wrm"] Feb 17 13:45:16 crc kubenswrapper[4804]: W0217 13:45:16.246731 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45330d20_989c_4507_ae57_5beaee075484.slice/crio-a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53 WatchSource:0}: Error finding container a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53: Status 404 returned error can't find the container with id a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53 Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.312281 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.507362 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.507474 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.507579 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.508959 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" (UID: "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.515776 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" (UID: "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.517378 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerStarted","Data":"a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53"} Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.518664 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fc5c8da-b323-4afb-aa47-125fc63caefd","Type":"ContainerStarted","Data":"d6d5305f6f8a461927703285f13aa4b342733e9de1167ba86afdd469ef338742"} Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.521738 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.521857 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerDied","Data":"004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db"} Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.521876 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.527420 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd" (OuterVolumeSpecName: "kube-api-access-qflrd") pod "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" (UID: "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f"). InnerVolumeSpecName "kube-api-access-qflrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.593221 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13452752-6880-43b4-9a63-8768d0afa122" path="/var/lib/kubelet/pods/13452752-6880-43b4-9a63-8768d0afa122/volumes" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.593920 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f6a03c-039e-4107-985b-803f59ccfb89" path="/var/lib/kubelet/pods/87f6a03c-039e-4107-985b-803f59ccfb89/volumes" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.609423 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.609469 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.609482 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:18 crc kubenswrapper[4804]: I0217 13:45:18.536660 4804 generic.go:334] "Generic (PLEG): container finished" podID="49b02c8f-ff07-48f9-8012-e78dc6591499" containerID="b93a15f86d51cc28e40802669fc1dc0ee030c02a56e4690a974969a6a5e38c99" exitCode=0 Feb 17 13:45:18 crc kubenswrapper[4804]: I0217 13:45:18.536781 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerDied","Data":"b93a15f86d51cc28e40802669fc1dc0ee030c02a56e4690a974969a6a5e38c99"} Feb 17 13:45:19 crc kubenswrapper[4804]: I0217 13:45:19.546169 4804 
generic.go:334] "Generic (PLEG): container finished" podID="f9eb8e8f-8bd1-4f69-84ee-27213046c709" containerID="4dad789ec862994f5efea14d5772a174e7195a22623c58ea7121822318679542" exitCode=0 Feb 17 13:45:19 crc kubenswrapper[4804]: I0217 13:45:19.546254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerDied","Data":"4dad789ec862994f5efea14d5772a174e7195a22623c58ea7121822318679542"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.603954 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"10e1124a-f402-422d-a906-8d22c90d4abe","Type":"ContainerStarted","Data":"a5c4a5ce7132a270b2b5975f3fb68551b963040bcebfde546b06f8fa1f907bb6"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.606478 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5ef96d0-19a6-4561-bde2-cf38e0280b39","Type":"ContainerStarted","Data":"48c3b4c65a16ba5ebe3448b8348b8660299955c67115f87eae6f949edef29da2"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.606644 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.608221 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fc5c8da-b323-4afb-aa47-125fc63caefd","Type":"ContainerStarted","Data":"a19ba278c8655bc8d0ada49c717a14ce1984144439dfb2b337adf3b24c18dd11"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.612047 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd" event={"ID":"9c049787-03d2-4679-8705-ec2cd1ad8141","Type":"ContainerStarted","Data":"708395c123e89d382895495e94d97e1a95dc8f67a8cb757f1413c13804265e38"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.612868 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-rzcfd" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.614514 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerStarted","Data":"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.614572 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.617495 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerStarted","Data":"39996e72e1656146c1fd21d8c62c9541f69f078226387f63ab48de9f15249bc6"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.619380 4804 generic.go:334] "Generic (PLEG): container finished" podID="45330d20-989c-4507-ae57-5beaee075484" containerID="217e77a7b262a2ea58a9d14b86a7ed1f48d810f819a1d8df9ec003eb84b66ae4" exitCode=0 Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.619471 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerDied","Data":"217e77a7b262a2ea58a9d14b86a7ed1f48d810f819a1d8df9ec003eb84b66ae4"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.622389 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerStarted","Data":"d0a7f9b158d9b783df1a932e85ef37a9acfdf64570dc9811c07b05b77f46bfbf"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.635845 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.526031984 podStartE2EDuration="28.635819816s" podCreationTimestamp="2026-02-17 13:44:53 +0000 UTC" 
firstStartedPulling="2026-02-17 13:45:14.35089735 +0000 UTC m=+1188.462316697" lastFinishedPulling="2026-02-17 13:45:19.460685192 +0000 UTC m=+1193.572104529" observedRunningTime="2026-02-17 13:45:21.631823951 +0000 UTC m=+1195.743243288" watchObservedRunningTime="2026-02-17 13:45:21.635819816 +0000 UTC m=+1195.747239153" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.657056 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rzcfd" podStartSLOduration=16.943692872 podStartE2EDuration="22.657037191s" podCreationTimestamp="2026-02-17 13:44:59 +0000 UTC" firstStartedPulling="2026-02-17 13:45:14.56786633 +0000 UTC m=+1188.679285667" lastFinishedPulling="2026-02-17 13:45:20.281210649 +0000 UTC m=+1194.392629986" observedRunningTime="2026-02-17 13:45:21.654997887 +0000 UTC m=+1195.766417224" watchObservedRunningTime="2026-02-17 13:45:21.657037191 +0000 UTC m=+1195.768456528" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.682502 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=17.075520288 podStartE2EDuration="30.682476739s" podCreationTimestamp="2026-02-17 13:44:51 +0000 UTC" firstStartedPulling="2026-02-17 13:45:00.43640937 +0000 UTC m=+1174.547828707" lastFinishedPulling="2026-02-17 13:45:14.043365821 +0000 UTC m=+1188.154785158" observedRunningTime="2026-02-17 13:45:21.677155602 +0000 UTC m=+1195.788574939" watchObservedRunningTime="2026-02-17 13:45:21.682476739 +0000 UTC m=+1195.793896096" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.707120 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=29.70709537 podStartE2EDuration="29.70709537s" podCreationTimestamp="2026-02-17 13:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 
13:45:21.701894467 +0000 UTC m=+1195.813313884" watchObservedRunningTime="2026-02-17 13:45:21.70709537 +0000 UTC m=+1195.818514707" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.770799 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.992601054 podStartE2EDuration="25.770767865s" podCreationTimestamp="2026-02-17 13:44:56 +0000 UTC" firstStartedPulling="2026-02-17 13:45:14.599471341 +0000 UTC m=+1188.710890678" lastFinishedPulling="2026-02-17 13:45:20.377638152 +0000 UTC m=+1194.489057489" observedRunningTime="2026-02-17 13:45:21.739074582 +0000 UTC m=+1195.850493919" watchObservedRunningTime="2026-02-17 13:45:21.770767865 +0000 UTC m=+1195.882187232" Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.533347 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.534114 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.631782 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerStarted","Data":"b657c439bdf2c279c1796f02b03ef98f0ccd6b8f5f26b36d733aa14612d348ec"} Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.633162 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"10e1124a-f402-422d-a906-8d22c90d4abe","Type":"ContainerStarted","Data":"903c8a51b1a4cf89113253fd0c5b969fd2f642adf54120a08aabbe7c263b3f27"} Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.639662 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"0fc5c8da-b323-4afb-aa47-125fc63caefd","Type":"ContainerStarted","Data":"484e55da3fe17a14782d7443a2dbf3691be841c5bbaf448fe334e0feba2b7a89"} Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.658403 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.489472895 podStartE2EDuration="20.658379065s" podCreationTimestamp="2026-02-17 13:45:02 +0000 UTC" firstStartedPulling="2026-02-17 13:45:15.033456273 +0000 UTC m=+1189.144875620" lastFinishedPulling="2026-02-17 13:45:22.202362433 +0000 UTC m=+1196.313781790" observedRunningTime="2026-02-17 13:45:22.64960622 +0000 UTC m=+1196.761025557" watchObservedRunningTime="2026-02-17 13:45:22.658379065 +0000 UTC m=+1196.769798402" Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.680654 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.481921791 podStartE2EDuration="24.680634373s" podCreationTimestamp="2026-02-17 13:44:58 +0000 UTC" firstStartedPulling="2026-02-17 13:45:15.987878587 +0000 UTC m=+1190.099297924" lastFinishedPulling="2026-02-17 13:45:22.186591149 +0000 UTC m=+1196.298010506" observedRunningTime="2026-02-17 13:45:22.672124146 +0000 UTC m=+1196.783543493" watchObservedRunningTime="2026-02-17 13:45:22.680634373 +0000 UTC m=+1196.792053710" Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.650481 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerStarted","Data":"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993"} Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.653020 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerStarted","Data":"5ede0db05c38355b5ea63ac0452cc946d69281f7b2bd7401f4770c9f3a1bf045"} 
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.653528 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.653551 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.706404 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-p4wrm" podStartSLOduration=20.674907496 podStartE2EDuration="24.706382692s" podCreationTimestamp="2026-02-17 13:44:59 +0000 UTC" firstStartedPulling="2026-02-17 13:45:16.250498248 +0000 UTC m=+1190.361917585" lastFinishedPulling="2026-02-17 13:45:20.281973444 +0000 UTC m=+1194.393392781" observedRunningTime="2026-02-17 13:45:23.699493496 +0000 UTC m=+1197.810912843" watchObservedRunningTime="2026-02-17 13:45:23.706382692 +0000 UTC m=+1197.817802039" Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.897661 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.944595 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.953651 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.038422 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.039265 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.664449 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 
17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.954364 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.004954 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.714012 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.715907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.835161 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.835551 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.023406 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"] Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.102233 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4s7l5"] Feb 17 13:45:26 crc kubenswrapper[4804]: E0217 13:45:26.102530 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerName="collect-profiles" Feb 17 13:45:26 crc kubenswrapper[4804]: 
I0217 13:45:26.102544 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerName="collect-profiles" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.102700 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerName="collect-profiles" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.103215 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4s7l5" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.106767 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.113149 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"] Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.114626 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.119653 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.130919 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"] Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.139131 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4s7l5"] Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193814 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" Feb 17 13:45:26 crc 
kubenswrapper[4804]: I0217 13:45:26.193886 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-combined-ca-bundle\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193925 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193951 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193978 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovn-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193999 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovs-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194039 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194082 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d286aa08-b0df-44e8-9128-f596f4b44db8-config\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194233 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfp4\" (UniqueName: \"kubernetes.io/projected/d286aa08-b0df-44e8-9128-f596f4b44db8-kube-api-access-ddfp4\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.220622 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.248306 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.254779 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.264042 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.264281 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.271058 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-c6spm"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.271749 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.292127 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295102 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295141 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-combined-ca-bundle\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295165 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295182 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295225 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovn-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295243 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovs-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295271 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.296980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.297983 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d286aa08-b0df-44e8-9128-f596f4b44db8-config\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.298030 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.298115 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfp4\" (UniqueName: \"kubernetes.io/projected/d286aa08-b0df-44e8-9128-f596f4b44db8-kube-api-access-ddfp4\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.314722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovn-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.316838 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovs-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.340357 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.341269 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350087 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350159 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-combined-ca-bundle\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350543 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350881 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d286aa08-b0df-44e8-9128-f596f4b44db8-config\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.351782 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.357868 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfp4\" (UniqueName: \"kubernetes.io/projected/d286aa08-b0df-44e8-9128-f596f4b44db8-kube-api-access-ddfp4\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.366606 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.383030 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.403833 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72mb\" (UniqueName: \"kubernetes.io/projected/3e322ccb-33cf-466f-91fb-63781bdcffb6-kube-api-access-f72mb\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.403920 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404000 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404031 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404141 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-scripts\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404160 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404316 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-config\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.430269 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.460279 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.478068 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506676 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506757 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506790 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506835 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506874 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-scripts\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506903 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506944 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507063 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-config\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507106 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507186 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72mb\" (UniqueName: \"kubernetes.io/projected/3e322ccb-33cf-466f-91fb-63781bdcffb6-kube-api-access-f72mb\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.511808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.512815 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-scripts\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.513604 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-config\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.514525 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.526380 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.530506 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.542136 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.561031 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72mb\" (UniqueName: \"kubernetes.io/projected/3e322ccb-33cf-466f-91fb-63781bdcffb6-kube-api-access-f72mb\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609013 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609083 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609141 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609176 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.612882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.613053 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.613271 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.613399 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.665386 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-c6spm"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.674120 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.674669 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.679707 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.784555 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.872149 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.917473 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"3586301a-dce2-427b-b5c4-9376e59fbf27\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") "
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.917541 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"3586301a-dce2-427b-b5c4-9376e59fbf27\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") "
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.917614 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"3586301a-dce2-427b-b5c4-9376e59fbf27\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") "
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.919063 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config" (OuterVolumeSpecName: "config") pod "3586301a-dce2-427b-b5c4-9376e59fbf27" (UID: "3586301a-dce2-427b-b5c4-9376e59fbf27"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.920440 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3586301a-dce2-427b-b5c4-9376e59fbf27" (UID: "3586301a-dce2-427b-b5c4-9376e59fbf27"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.938532 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l" (OuterVolumeSpecName: "kube-api-access-9zx6l") pod "3586301a-dce2-427b-b5c4-9376e59fbf27" (UID: "3586301a-dce2-427b-b5c4-9376e59fbf27"). InnerVolumeSpecName "kube-api-access-9zx6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.987565 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp"
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.020051 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.020091 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.020105 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.121740 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"8175f453-b68b-4236-844d-ff723515fe63\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") "
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.122352 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"8175f453-b68b-4236-844d-ff723515fe63\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") "
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.122545 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"8175f453-b68b-4236-844d-ff723515fe63\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") "
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.123869 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config" (OuterVolumeSpecName: "config") pod "8175f453-b68b-4236-844d-ff723515fe63" (UID: "8175f453-b68b-4236-844d-ff723515fe63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.124571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8175f453-b68b-4236-844d-ff723515fe63" (UID: "8175f453-b68b-4236-844d-ff723515fe63"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.127423 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl" (OuterVolumeSpecName: "kube-api-access-kv9pl") pod "8175f453-b68b-4236-844d-ff723515fe63" (UID: "8175f453-b68b-4236-844d-ff723515fe63"). InnerVolumeSpecName "kube-api-access-kv9pl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.229226 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.229280 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.229295 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.233336 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4s7l5"]
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.241182 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"]
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.391305 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 13:45:27 crc kubenswrapper[4804]: W0217 13:45:27.408532 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e322ccb_33cf_466f_91fb_63781bdcffb6.slice/crio-26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad WatchSource:0}: Error finding container 26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad: Status 404 returned error can't find the container with id 26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.445012 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"]
Feb 17 13:45:27 crc kubenswrapper[4804]: W0217 13:45:27.448061 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2e35955_0967_4a9c_b4e5_68316c98d58f.slice/crio-2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921 WatchSource:0}: Error finding container 2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921: Status 404 returned error can't find the container with id 2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.730022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerStarted","Data":"2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921"}
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4s7l5" event={"ID":"d286aa08-b0df-44e8-9128-f596f4b44db8","Type":"ContainerStarted","Data":"5a253bb1e30f5407483358daadb8d300069de0d9a15599231de8d80c785f568b"}
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734925 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4s7l5" event={"ID":"d286aa08-b0df-44e8-9128-f596f4b44db8","Type":"ContainerStarted","Data":"421745ebbe6511f48fac6a70851b5f11760a42f7a8de727170ebbb740043a13e"}
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734941 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" event={"ID":"3586301a-dce2-427b-b5c4-9376e59fbf27","Type":"ContainerDied","Data":"b6acb0860f5dd58b1333ac392aa371b170675172cb3eb7dbaaabc60cbdae0d1e"}
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734962 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" event={"ID":"8175f453-b68b-4236-844d-ff723515fe63","Type":"ContainerDied","Data":"6dec93dab248c776ff8091a9233f8da9e53443d47dfd060ebb89371b1dc81611"}
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.736096 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerStarted","Data":"beea4edb9ab8a4290234d33096c41b7ec1bcf83e4fedb843a0fee43bc42ec3a0"}
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.739484 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp"
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.741268 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e322ccb-33cf-466f-91fb-63781bdcffb6","Type":"ContainerStarted","Data":"26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad"}
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.741370 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6"
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.768127 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4s7l5" podStartSLOduration=1.768098736 podStartE2EDuration="1.768098736s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:27.759282639 +0000 UTC m=+1201.870701976" watchObservedRunningTime="2026-02-17 13:45:27.768098736 +0000 UTC m=+1201.879518073"
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.880883 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"]
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.886777 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"]
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.917532 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"]
Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.927452 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"]
Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.195460 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.276437 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.585643 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3586301a-dce2-427b-b5c4-9376e59fbf27" path="/var/lib/kubelet/pods/3586301a-dce2-427b-b5c4-9376e59fbf27/volumes"
Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.586769 4804 kubelet_volumes.go:163]
"Cleaned up orphaned pod volumes dir" podUID="8175f453-b68b-4236-844d-ff723515fe63" path="/var/lib/kubelet/pods/8175f453-b68b-4236-844d-ff723515fe63/volumes" Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.751051 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e322ccb-33cf-466f-91fb-63781bdcffb6","Type":"ContainerStarted","Data":"2ea9196e59af7aa250cd15085ebe409c06fd1598d66d60e3f77b06e040e8e13b"} Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.753139 4804 generic.go:334] "Generic (PLEG): container finished" podID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" exitCode=0 Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.753271 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerDied","Data":"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e"} Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.754822 4804 generic.go:334] "Generic (PLEG): container finished" podID="50142f30-df04-4aa7-85e1-e303286966b7" containerID="ab28572c13b5040bf4ce2b36ef6ec484d61c5f798e505ab5c93281f67d3def85" exitCode=0 Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.754932 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerDied","Data":"ab28572c13b5040bf4ce2b36ef6ec484d61c5f798e505ab5c93281f67d3def85"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.082073 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.774807 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"3e322ccb-33cf-466f-91fb-63781bdcffb6","Type":"ContainerStarted","Data":"59c5735b217f50dec368f226d655e3f5011a449fe03afeb3ec36351d968b71bf"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.775172 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.779085 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerStarted","Data":"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.779757 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.785830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerStarted","Data":"e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.786763 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.804779 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.760325873 podStartE2EDuration="3.804757638s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="2026-02-17 13:45:27.411962273 +0000 UTC m=+1201.523381610" lastFinishedPulling="2026-02-17 13:45:28.456394048 +0000 UTC m=+1202.567813375" observedRunningTime="2026-02-17 13:45:29.801620921 +0000 UTC m=+1203.913040268" watchObservedRunningTime="2026-02-17 13:45:29.804757638 +0000 UTC m=+1203.916176975" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.819990 4804 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" podStartSLOduration=3.384035783 podStartE2EDuration="3.819966746s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="2026-02-17 13:45:27.249940955 +0000 UTC m=+1201.361360292" lastFinishedPulling="2026-02-17 13:45:27.685871918 +0000 UTC m=+1201.797291255" observedRunningTime="2026-02-17 13:45:29.817356344 +0000 UTC m=+1203.928775711" watchObservedRunningTime="2026-02-17 13:45:29.819966746 +0000 UTC m=+1203.931386083" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.841514 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-s5nsf" podStartSLOduration=3.418349279 podStartE2EDuration="3.84148699s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="2026-02-17 13:45:27.452608748 +0000 UTC m=+1201.564028085" lastFinishedPulling="2026-02-17 13:45:27.875746459 +0000 UTC m=+1201.987165796" observedRunningTime="2026-02-17 13:45:29.834083628 +0000 UTC m=+1203.945502985" watchObservedRunningTime="2026-02-17 13:45:29.84148699 +0000 UTC m=+1203.952906327" Feb 17 13:45:30 crc kubenswrapper[4804]: I0217 13:45:30.705532 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 17 13:45:30 crc kubenswrapper[4804]: I0217 13:45:30.802895 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.284258 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.285755 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.290759 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.294925 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.313107 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.313165 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.414283 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.414349 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"root-account-create-update-hbdkd\" (UID: 
\"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.415142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.435739 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.615402 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:32 crc kubenswrapper[4804]: I0217 13:45:32.117513 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:32 crc kubenswrapper[4804]: W0217 13:45:32.125461 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded593eeb_17c6_42cc_a392_9fbb1f3aef6e.slice/crio-0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d WatchSource:0}: Error finding container 0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d: Status 404 returned error can't find the container with id 0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d Feb 17 13:45:32 crc kubenswrapper[4804]: I0217 13:45:32.810447 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerStarted","Data":"0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d"} Feb 17 13:45:34 crc kubenswrapper[4804]: I0217 13:45:34.833892 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerStarted","Data":"b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4"} Feb 17 13:45:34 crc kubenswrapper[4804]: I0217 13:45:34.857410 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-hbdkd" podStartSLOduration=3.85738542 podStartE2EDuration="3.85738542s" podCreationTimestamp="2026-02-17 13:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:34.850474303 +0000 UTC m=+1208.961893660" watchObservedRunningTime="2026-02-17 13:45:34.85738542 +0000 UTC m=+1208.968804777" Feb 17 
13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.135530 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.136627 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.165898 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.190295 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.190436 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.232612 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.233737 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.235692 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.242152 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292397 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292459 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"keystone-886b-account-create-update-h84mx\" (UID: 
\"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.293245 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.311886 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.384583 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.385983 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.391774 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.394893 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.394990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.396410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.419736 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.468851 4804 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.496294 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.496472 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.539057 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.540457 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.542513 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.547378 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.548965 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.597709 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598100 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598241 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598307 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598663 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.620972 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.699514 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.699691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.700728 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.707661 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.722171 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.844963 4804 generic.go:334] "Generic (PLEG): container finished" podID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerID="b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4" exitCode=0 Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.845027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerDied","Data":"b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4"} Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.935886 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.966934 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:45:35 crc kubenswrapper[4804]: W0217 13:45:35.976632 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bc37bd5_6784_41f8_98de_ef6a43493cd6.slice/crio-f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0 WatchSource:0}: Error finding container f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0: Status 404 returned error can't find the container with id f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.064747 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:45:36 crc kubenswrapper[4804]: W0217 13:45:36.077057 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9dbe9b_ced6_453d_9f59_0d92e2a69043.slice/crio-f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3 WatchSource:0}: Error finding container f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3: Status 404 returned error can't find the container with id f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.138559 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:45:36 crc kubenswrapper[4804]: W0217 13:45:36.156770 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2edd89a7_0866_4677_8b25_9654130c6ac5.slice/crio-e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c WatchSource:0}: Error 
finding container e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c: Status 404 returned error can't find the container with id e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.421026 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"]
Feb 17 13:45:36 crc kubenswrapper[4804]: W0217 13:45:36.428713 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba7e6539_c0c9_40e7_b076_38cc23f233cc.slice/crio-a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1 WatchSource:0}: Error finding container a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1: Status 404 returned error can't find the container with id a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.432891 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"]
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.433513 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns" containerID="cri-o://e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0" gracePeriod=10
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.435410 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.487095 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: connect: connection refused"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.491437 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"]
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.493470 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.568412 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"]
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625672 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625800 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625859 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625980 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.727936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.728386 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.728808 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.729093 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.729587 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.729712 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.732022 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.732769 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.733241 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.752008 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.858458 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerStarted","Data":"f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.858514 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerStarted","Data":"a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.861298 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerID="df5f178d05ce64eb60f91663ba876543b059e11efed3814a687a5cde6c71f197" exitCode=0
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.861416 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-886b-account-create-update-h84mx" event={"ID":"6f9dbe9b-ced6-453d-9f59-0d92e2a69043","Type":"ContainerDied","Data":"df5f178d05ce64eb60f91663ba876543b059e11efed3814a687a5cde6c71f197"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.861444 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-886b-account-create-update-h84mx" event={"ID":"6f9dbe9b-ced6-453d-9f59-0d92e2a69043","Type":"ContainerStarted","Data":"f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.862932 4804 generic.go:334] "Generic (PLEG): container finished" podID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerID="e84b0f31988f4caf559aaf77b9c196ea5e660cca5bf9a529065d3d4f3f6186e1" exitCode=0
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.862982 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dl5b9" event={"ID":"4bc37bd5-6784-41f8-98de-ef6a43493cd6","Type":"ContainerDied","Data":"e84b0f31988f4caf559aaf77b9c196ea5e660cca5bf9a529065d3d4f3f6186e1"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.862997 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dl5b9" event={"ID":"4bc37bd5-6784-41f8-98de-ef6a43493cd6","Type":"ContainerStarted","Data":"f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.870714 4804 generic.go:334] "Generic (PLEG): container finished" podID="50142f30-df04-4aa7-85e1-e303286966b7" containerID="e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0" exitCode=0
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.870776 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerDied","Data":"e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.872900 4804 generic.go:334] "Generic (PLEG): container finished" podID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerID="9b6aded40ee8715e414f7eaa0e4d2635fac772bb7db34b9cafa3737130656836" exitCode=0
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.872958 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6m6pk" event={"ID":"2edd89a7-0866-4677-8b25-9654130c6ac5","Type":"ContainerDied","Data":"9b6aded40ee8715e414f7eaa0e4d2635fac772bb7db34b9cafa3737130656836"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.872978 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6m6pk" event={"ID":"2edd89a7-0866-4677-8b25-9654130c6ac5","Type":"ContainerStarted","Data":"e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c"}
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.874335 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.895130 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0898-account-create-update-6vpd7" podStartSLOduration=1.895098746 podStartE2EDuration="1.895098746s" podCreationTimestamp="2026-02-17 13:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:36.875917615 +0000 UTC m=+1210.987336972" watchObservedRunningTime="2026-02-17 13:45:36.895098746 +0000 UTC m=+1211.006518103"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.014708 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.215254 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hbdkd"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.245785 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") "
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.245930 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") "
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.246564 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" (UID: "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.253316 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k" (OuterVolumeSpecName: "kube-api-access-5fb7k") pod "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" (UID: "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e"). InnerVolumeSpecName "kube-api-access-5fb7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.347506 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.347557 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.487129 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551005 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") "
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551108 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") "
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551209 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") "
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551348 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") "
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.558389 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv" (OuterVolumeSpecName: "kube-api-access-csrsv") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "kube-api-access-csrsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.595481 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config" (OuterVolumeSpecName: "config") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.596213 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.598521 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.598658 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"]
Feb 17 13:45:37 crc kubenswrapper[4804]: W0217 13:45:37.603063 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86aca321_b4a3_4d89_ab34_5d311aa11fe9.slice/crio-cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f WatchSource:0}: Error finding container cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f: Status 404 returned error can't find the container with id cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.612738 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.613170 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="init"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613195 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="init"
Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.613231 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613239 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns"
Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.613266 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerName="mariadb-account-create-update"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613274 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerName="mariadb-account-create-update"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613485 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613521 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerName="mariadb-account-create-update"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.619932 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622622 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622645 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dl8mk"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622689 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622700 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.637029 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.663854 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfhb\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-kube-api-access-vcfhb\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.663911 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.663975 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664025 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-lock\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664091 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-cache\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664122 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664238 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664253 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664264 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664275 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765284 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-lock\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765371 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-cache\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765764 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765778 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-cache\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765917 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-lock\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765952 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfhb\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-kube-api-access-vcfhb\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.766075 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.766153 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.766277 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.766305 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.766348 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:38.266332822 +0000 UTC m=+1212.377752159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.766978 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.772730 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.783823 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfhb\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-kube-api-access-vcfhb\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.793831 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.890470 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerDied","Data":"beea4edb9ab8a4290234d33096c41b7ec1bcf83e4fedb843a0fee43bc42ec3a0"}
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.890529 4804 scope.go:117] "RemoveContainer" containerID="e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.890650 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.894659 4804 generic.go:334] "Generic (PLEG): container finished" podID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0" exitCode=0
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.894766 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerDied","Data":"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0"}
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.894793 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerStarted","Data":"cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f"}
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.896093 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hbdkd"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.896140 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerDied","Data":"0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d"}
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.896174 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.897787 4804 generic.go:334] "Generic (PLEG): container finished" podID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerID="f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf" exitCode=0
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.897836 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerDied","Data":"f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf"}
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.946092 4804 scope.go:117] "RemoveContainer" containerID="ab28572c13b5040bf4ce2b36ef6ec484d61c5f798e505ab5c93281f67d3def85"
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.978731 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"]
Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.985359 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"]
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.227281 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx"
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.286105 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") "
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.286270 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") "
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.286522 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0"
Feb 17 13:45:38 crc kubenswrapper[4804]: E0217 13:45:38.286832 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 13:45:38 crc kubenswrapper[4804]: E0217 13:45:38.286859 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 13:45:38 crc kubenswrapper[4804]: E0217 13:45:38.286911 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:39.286891688 +0000 UTC m=+1213.398311025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.289937 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f9dbe9b-ced6-453d-9f59-0d92e2a69043" (UID: "6f9dbe9b-ced6-453d-9f59-0d92e2a69043"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.299825 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv" (OuterVolumeSpecName: "kube-api-access-5mdgv") pod "6f9dbe9b-ced6-453d-9f59-0d92e2a69043" (UID: "6f9dbe9b-ced6-453d-9f59-0d92e2a69043"). InnerVolumeSpecName "kube-api-access-5mdgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.388258 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.388300 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.406460 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dl5b9"
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.409804 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6m6pk"
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.489940 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"2edd89a7-0866-4677-8b25-9654130c6ac5\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") "
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490089 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") "
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490161 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") "
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490221 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"2edd89a7-0866-4677-8b25-9654130c6ac5\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") "
Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490903 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod
"4bc37bd5-6784-41f8-98de-ef6a43493cd6" (UID: "4bc37bd5-6784-41f8-98de-ef6a43493cd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490964 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2edd89a7-0866-4677-8b25-9654130c6ac5" (UID: "2edd89a7-0866-4677-8b25-9654130c6ac5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.494327 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx" (OuterVolumeSpecName: "kube-api-access-nqldx") pod "2edd89a7-0866-4677-8b25-9654130c6ac5" (UID: "2edd89a7-0866-4677-8b25-9654130c6ac5"). InnerVolumeSpecName "kube-api-access-nqldx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.494494 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw" (OuterVolumeSpecName: "kube-api-access-8qsxw") pod "4bc37bd5-6784-41f8-98de-ef6a43493cd6" (UID: "4bc37bd5-6784-41f8-98de-ef6a43493cd6"). InnerVolumeSpecName "kube-api-access-8qsxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.584155 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50142f30-df04-4aa7-85e1-e303286966b7" path="/var/lib/kubelet/pods/50142f30-df04-4aa7-85e1-e303286966b7/volumes" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604618 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604656 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604670 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604683 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.907377 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerStarted","Data":"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.907486 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.911746 4804 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.911733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6m6pk" event={"ID":"2edd89a7-0866-4677-8b25-9654130c6ac5","Type":"ContainerDied","Data":"e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.911891 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.914984 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.915050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-886b-account-create-update-h84mx" event={"ID":"6f9dbe9b-ced6-453d-9f59-0d92e2a69043","Type":"ContainerDied","Data":"f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.915078 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.917872 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.918506 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dl5b9" event={"ID":"4bc37bd5-6784-41f8-98de-ef6a43493cd6","Type":"ContainerDied","Data":"f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.918530 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.936928 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" podStartSLOduration=2.93690596 podStartE2EDuration="2.93690596s" podCreationTimestamp="2026-02-17 13:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:38.927499066 +0000 UTC m=+1213.038918413" watchObservedRunningTime="2026-02-17 13:45:38.93690596 +0000 UTC m=+1213.048325297" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.227772 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.228554 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228574 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.228607 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 
13:45:39.228615 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.228647 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228655 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228855 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228879 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228893 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.229573 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.244176 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.256319 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.322417 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.322717 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.322728 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.322895 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.323845 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.329154 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.329826 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.329911 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.330242 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.330283 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.330315 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.331015 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.331038 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.331084 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:41.331064585 +0000 UTC m=+1215.442483982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.331550 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba7e6539-c0c9-40e7-b076-38cc23f233cc" (UID: "ba7e6539-c0c9-40e7-b076-38cc23f233cc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.336390 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk" (OuterVolumeSpecName: "kube-api-access-6rvmk") pod "ba7e6539-c0c9-40e7-b076-38cc23f233cc" (UID: "ba7e6539-c0c9-40e7-b076-38cc23f233cc"). InnerVolumeSpecName "kube-api-access-6rvmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.340848 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432313 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432372 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432412 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432439 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432510 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432524 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.433375 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.451582 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.534862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"glance-f8a9-account-create-update-98wtk\" 
(UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.534936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.535932 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.553975 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.568223 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.684737 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.934174 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerDied","Data":"a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1"} Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.934619 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.934223 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.999808 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.157874 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:45:40 crc kubenswrapper[4804]: W0217 13:45:40.160533 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0597f43_df0a_427f_b045_e6859849a0d6.slice/crio-25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f WatchSource:0}: Error finding container 25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f: Status 404 returned error can't find the container with id 25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.944401 4804 generic.go:334] "Generic (PLEG): container finished" podID="b0597f43-df0a-427f-b045-e6859849a0d6" containerID="523c2a0dce1e6efc07d04ec334853ccdc0d1e041c66ee6b003b630197674d70f" exitCode=0 Feb 17 13:45:40 crc kubenswrapper[4804]: 
I0217 13:45:40.944625 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f8a9-account-create-update-98wtk" event={"ID":"b0597f43-df0a-427f-b045-e6859849a0d6","Type":"ContainerDied","Data":"523c2a0dce1e6efc07d04ec334853ccdc0d1e041c66ee6b003b630197674d70f"} Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.944789 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f8a9-account-create-update-98wtk" event={"ID":"b0597f43-df0a-427f-b045-e6859849a0d6","Type":"ContainerStarted","Data":"25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f"} Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.946751 4804 generic.go:334] "Generic (PLEG): container finished" podID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerID="5707e03ce1413559d6e451944a8178ed7c1374503c523227f07af12a0d1deda1" exitCode=0 Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.946786 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v8tb5" event={"ID":"4c8ee09a-97bd-4497-81cd-2f0f4952d996","Type":"ContainerDied","Data":"5707e03ce1413559d6e451944a8178ed7c1374503c523227f07af12a0d1deda1"} Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.946805 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v8tb5" event={"ID":"4c8ee09a-97bd-4497-81cd-2f0f4952d996","Type":"ContainerStarted","Data":"edb5c73dae0b867ad78d1e3312eb0f89bf13d520b78c11a76297e212dcf745ce"} Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.414943 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:41 crc kubenswrapper[4804]: E0217 13:45:41.415256 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" 
not found Feb 17 13:45:41 crc kubenswrapper[4804]: E0217 13:45:41.415291 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 13:45:41 crc kubenswrapper[4804]: E0217 13:45:41.415366 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:45.41534205 +0000 UTC m=+1219.526761387 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.471461 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mv8w5"] Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.472474 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.474297 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.474710 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.475893 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.486216 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mv8w5"] Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.619890 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.619955 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.619977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: 
I0217 13:45:41.620008 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.620193 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.620351 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.620486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.721907 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.721985 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722022 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722136 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722180 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722206 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722298 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722902 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.723028 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.723278 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.728954 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.730473 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod 
\"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.730696 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.745732 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.798791 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.213005 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mv8w5"] Feb 17 13:45:42 crc kubenswrapper[4804]: W0217 13:45:42.221765 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41aa78f0_ef58_4a36_b1f9_ce222fd8e1e2.slice/crio-a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8 WatchSource:0}: Error finding container a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8: Status 404 returned error can't find the container with id a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8 Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.255706 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.317203 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433015 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433114 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433230 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"b0597f43-df0a-427f-b045-e6859849a0d6\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433295 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"b0597f43-df0a-427f-b045-e6859849a0d6\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.434109 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"b0597f43-df0a-427f-b045-e6859849a0d6" (UID: "b0597f43-df0a-427f-b045-e6859849a0d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.434438 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c8ee09a-97bd-4497-81cd-2f0f4952d996" (UID: "4c8ee09a-97bd-4497-81cd-2f0f4952d996"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.438350 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj" (OuterVolumeSpecName: "kube-api-access-gj5wj") pod "b0597f43-df0a-427f-b045-e6859849a0d6" (UID: "b0597f43-df0a-427f-b045-e6859849a0d6"). InnerVolumeSpecName "kube-api-access-gj5wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.438412 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q" (OuterVolumeSpecName: "kube-api-access-wrz7q") pod "4c8ee09a-97bd-4497-81cd-2f0f4952d996" (UID: "4c8ee09a-97bd-4497-81cd-2f0f4952d996"). InnerVolumeSpecName "kube-api-access-wrz7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535557 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535590 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535601 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535610 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.623224 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.629567 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.961156 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerStarted","Data":"a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8"} Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.963042 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f8a9-account-create-update-98wtk" 
event={"ID":"b0597f43-df0a-427f-b045-e6859849a0d6","Type":"ContainerDied","Data":"25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f"} Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.963059 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.963068 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.966391 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v8tb5" event={"ID":"4c8ee09a-97bd-4497-81cd-2f0f4952d996","Type":"ContainerDied","Data":"edb5c73dae0b867ad78d1e3312eb0f89bf13d520b78c11a76297e212dcf745ce"} Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.966523 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.966521 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edb5c73dae0b867ad78d1e3312eb0f89bf13d520b78c11a76297e212dcf745ce" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.504557 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:45:44 crc kubenswrapper[4804]: E0217 13:45:44.506690 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" containerName="mariadb-account-create-update" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506714 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" containerName="mariadb-account-create-update" Feb 17 13:45:44 crc kubenswrapper[4804]: E0217 13:45:44.506738 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerName="mariadb-database-create" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506744 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerName="mariadb-database-create" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506956 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" containerName="mariadb-account-create-update" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506981 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerName="mariadb-database-create" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.507645 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.513855 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.514576 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-r5s28" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.520183 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.585821 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" path="/var/lib/kubelet/pods/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e/volumes" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671195 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " 
pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671303 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671355 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671631 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.774527 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.774627 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 
crc kubenswrapper[4804]: I0217 13:45:44.774708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.774871 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.782519 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.783505 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.784177 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.797335 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.828366 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:45 crc kubenswrapper[4804]: I0217 13:45:45.487440 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:45 crc kubenswrapper[4804]: E0217 13:45:45.487669 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 13:45:45 crc kubenswrapper[4804]: E0217 13:45:45.488166 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 13:45:45 crc kubenswrapper[4804]: E0217 13:45:45.488250 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:53.488227173 +0000 UTC m=+1227.599646510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:45 crc kubenswrapper[4804]: I0217 13:45:45.715552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:45:45 crc kubenswrapper[4804]: W0217 13:45:45.716525 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb6c8ec_f280_4566_bb37_b286119956b5.slice/crio-8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18 WatchSource:0}: Error finding container 8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18: Status 404 returned error can't find the container with id 8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18 Feb 17 13:45:45 crc kubenswrapper[4804]: I0217 13:45:45.998886 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerStarted","Data":"8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18"} Feb 17 13:45:46 crc kubenswrapper[4804]: I0217 13:45:46.000629 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerStarted","Data":"acbd8ddba5d51200f8256011420ddf0cc657b7bccf8bce2bdfa4bb2a827a329a"} Feb 17 13:45:46 crc kubenswrapper[4804]: I0217 13:45:46.020439 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mv8w5" podStartSLOduration=2.018387553 podStartE2EDuration="5.020421643s" podCreationTimestamp="2026-02-17 13:45:41 +0000 UTC" firstStartedPulling="2026-02-17 13:45:42.225857764 +0000 UTC m=+1216.337277101" lastFinishedPulling="2026-02-17 13:45:45.227891824 
+0000 UTC m=+1219.339311191" observedRunningTime="2026-02-17 13:45:46.015391326 +0000 UTC m=+1220.126810663" watchObservedRunningTime="2026-02-17 13:45:46.020421643 +0000 UTC m=+1220.131840980" Feb 17 13:45:46 crc kubenswrapper[4804]: I0217 13:45:46.760632 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.016364 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.110691 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"] Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.111057 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-s5nsf" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="dnsmasq-dns" containerID="cri-o://4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" gracePeriod=10 Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.548458 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627601 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627662 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627686 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627704 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.668765 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:45:47 crc kubenswrapper[4804]: E0217 13:45:47.671418 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="dnsmasq-dns" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.671461 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" 
containerName="dnsmasq-dns" Feb 17 13:45:47 crc kubenswrapper[4804]: E0217 13:45:47.671602 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="init" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.671611 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="init" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.672229 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="dnsmasq-dns" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.672989 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.681908 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.690200 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.709095 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.718726 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.720009 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.723018 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config" (OuterVolumeSpecName: "config") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.728733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731003 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731041 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731052 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731060 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.733142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx" (OuterVolumeSpecName: "kube-api-access-n7ndx") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "kube-api-access-n7ndx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.832711 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.832887 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.832990 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc 
kubenswrapper[4804]: I0217 13:45:47.934625 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.934783 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.935447 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.953361 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.021260 4804 generic.go:334] "Generic (PLEG): container finished" podID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerID="de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162" exitCode=0 Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.021330 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerDied","Data":"de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162"} Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025556 4804 generic.go:334] "Generic (PLEG): container finished" podID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" exitCode=0 Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025690 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerDied","Data":"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2"} Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025732 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerDied","Data":"2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921"} Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025773 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025790 4804 scope.go:117] "RemoveContainer" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.069476 4804 scope.go:117] "RemoveContainer" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.070659 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.075731 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"] Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.084615 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"] Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.095369 4804 scope.go:117] "RemoveContainer" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" Feb 17 13:45:48 crc kubenswrapper[4804]: E0217 13:45:48.095756 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2\": container with ID starting with 4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2 not found: ID does not exist" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.095800 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2"} err="failed to get container status \"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2\": rpc error: code = NotFound desc = could not find container \"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2\": container with ID starting with 4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2 not found: ID does not exist" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.095834 4804 scope.go:117] "RemoveContainer" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" Feb 17 13:45:48 crc kubenswrapper[4804]: E0217 13:45:48.096312 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e\": container with ID starting with 84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e not found: ID does not exist" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.096350 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e"} err="failed to get container status \"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e\": rpc error: code = NotFound desc = could not find container \"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e\": container with ID starting with 84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e not found: ID does not exist" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.510349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:45:48 crc kubenswrapper[4804]: W0217 13:45:48.517802 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c3b824f_ae3d_4681_8b14_16099a2643d5.slice/crio-756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28 WatchSource:0}: Error finding container 756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28: Status 404 returned error can't find the container with id 756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28 Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.583402 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" path="/var/lib/kubelet/pods/f2e35955-0967-4a9c-b4e5-68316c98d58f/volumes" Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.033796 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerID="a042fc58bb60ee18221f1218414ff109d197e288fe316a76abf5d21b41df0c21" exitCode=0 Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.033862 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wmz" event={"ID":"6c3b824f-ae3d-4681-8b14-16099a2643d5","Type":"ContainerDied","Data":"a042fc58bb60ee18221f1218414ff109d197e288fe316a76abf5d21b41df0c21"} Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.033900 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wmz" event={"ID":"6c3b824f-ae3d-4681-8b14-16099a2643d5","Type":"ContainerStarted","Data":"756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28"} Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.035265 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerStarted","Data":"e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb"} Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.035931 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.077438 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.774246489 podStartE2EDuration="59.077410486s" podCreationTimestamp="2026-02-17 13:44:50 +0000 UTC" firstStartedPulling="2026-02-17 13:44:52.619459199 +0000 UTC m=+1166.730878536" lastFinishedPulling="2026-02-17 13:45:13.922623196 +0000 UTC m=+1188.034042533" observedRunningTime="2026-02-17 13:45:49.073156293 +0000 UTC m=+1223.184575650" watchObservedRunningTime="2026-02-17 13:45:49.077410486 +0000 UTC m=+1223.188829833" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.438090 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.587755 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"6c3b824f-ae3d-4681-8b14-16099a2643d5\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.588492 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"6c3b824f-ae3d-4681-8b14-16099a2643d5\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.588640 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c3b824f-ae3d-4681-8b14-16099a2643d5" (UID: "6c3b824f-ae3d-4681-8b14-16099a2643d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.589161 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.609441 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q" (OuterVolumeSpecName: "kube-api-access-4kv9q") pod "6c3b824f-ae3d-4681-8b14-16099a2643d5" (UID: "6c3b824f-ae3d-4681-8b14-16099a2643d5"). InnerVolumeSpecName "kube-api-access-4kv9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.694392 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:51 crc kubenswrapper[4804]: I0217 13:45:51.077292 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wmz" event={"ID":"6c3b824f-ae3d-4681-8b14-16099a2643d5","Type":"ContainerDied","Data":"756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28"} Feb 17 13:45:51 crc kubenswrapper[4804]: I0217 13:45:51.077545 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28" Feb 17 13:45:51 crc kubenswrapper[4804]: I0217 13:45:51.077485 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.100132 4804 generic.go:334] "Generic (PLEG): container finished" podID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerID="acbd8ddba5d51200f8256011420ddf0cc657b7bccf8bce2bdfa4bb2a827a329a" exitCode=0 Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.100274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerDied","Data":"acbd8ddba5d51200f8256011420ddf0cc657b7bccf8bce2bdfa4bb2a827a329a"} Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.544820 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 
13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.563336 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.839900 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.408330 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rzcfd" podUID="9c049787-03d2-4679-8705-ec2cd1ad8141" containerName="ovn-controller" probeResult="failure" output=< Feb 17 13:45:54 crc kubenswrapper[4804]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 13:45:54 crc kubenswrapper[4804]: > Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.460802 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.463783 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.698106 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"] Feb 17 13:45:54 crc kubenswrapper[4804]: E0217 13:45:54.703442 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerName="mariadb-account-create-update" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.703599 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerName="mariadb-account-create-update" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.703812 4804 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerName="mariadb-account-create-update" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.704495 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.713593 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.758969 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"] Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872253 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872356 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872399 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872457 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872665 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872761 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975390 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975436 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975466 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975532 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975578 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975954 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.976037 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.977018 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.978728 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.002151 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.032432 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.835589 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.835933 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:45:56 crc kubenswrapper[4804]: I0217 13:45:56.129260 4804 generic.go:334] "Generic (PLEG): container finished" podID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" exitCode=0 Feb 17 13:45:56 crc kubenswrapper[4804]: I0217 13:45:56.129302 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerDied","Data":"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993"} Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.581130 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.721368 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722046 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722659 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722774 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722870 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722969 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.723066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.723288 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.723645 4804 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.724338 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.727091 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9" (OuterVolumeSpecName: "kube-api-access-85nm9") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "kube-api-access-85nm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.736274 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.742978 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts" (OuterVolumeSpecName: "scripts") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.744886 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.748248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.824942 4804 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.824987 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825002 4804 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825014 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825026 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825037 4804 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.889820 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"]
Feb 17 13:45:57 crc kubenswrapper[4804]: W0217 13:45:57.893667 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bce6d8f_9e27_4d98_8003_f5e7b368f816.slice/crio-27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded WatchSource:0}: Error finding container 27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded: Status 404 returned error can't find the container with id 27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.067892 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 17 13:45:58 crc kubenswrapper[4804]: W0217 13:45:58.079611 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90da6e89_6033_4e42_a5ca_bed1a5ad6a46.slice/crio-4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013 WatchSource:0}: Error finding container 4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013: Status 404 returned error can't find the container with id 4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.144821 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013"}
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.147293 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerStarted","Data":"95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3"}
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.151519 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-vlgpc" event={"ID":"5bce6d8f-9e27-4d98-8003-f5e7b368f816","Type":"ContainerStarted","Data":"27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded"}
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.155332 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerDied","Data":"a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8"}
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.155354 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5"
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.155376 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8"
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.157845 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerStarted","Data":"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3"}
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.158055 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.169944 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lpd9f" podStartSLOduration=2.399252946 podStartE2EDuration="14.169924596s" podCreationTimestamp="2026-02-17 13:45:44 +0000 UTC" firstStartedPulling="2026-02-17 13:45:45.718447209 +0000 UTC m=+1219.829866546" lastFinishedPulling="2026-02-17 13:45:57.489118849 +0000 UTC m=+1231.600538196" observedRunningTime="2026-02-17 13:45:58.167780779 +0000 UTC m=+1232.279200116" watchObservedRunningTime="2026-02-17 13:45:58.169924596 +0000 UTC m=+1232.281343933"
Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.193744 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371967.66105 podStartE2EDuration="1m9.193725193s" podCreationTimestamp="2026-02-17 13:44:49 +0000 UTC" firstStartedPulling="2026-02-17 13:44:52.039880363 +0000 UTC m=+1166.151299700" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:58.187406514 +0000 UTC m=+1232.298825861" watchObservedRunningTime="2026-02-17 13:45:58.193725193 +0000 UTC m=+1232.305144530"
Feb 17 13:45:59 crc kubenswrapper[4804]: I0217 13:45:59.168015 4804 generic.go:334] "Generic (PLEG): container finished" podID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerID="c204297abebd9a53145ab03c24cc8848ddb7478ea7164daa834f5efc7f82083d" exitCode=0
Feb 17 13:45:59 crc kubenswrapper[4804]: I0217 13:45:59.168121 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-vlgpc" event={"ID":"5bce6d8f-9e27-4d98-8003-f5e7b368f816","Type":"ContainerDied","Data":"c204297abebd9a53145ab03c24cc8848ddb7478ea7164daa834f5efc7f82083d"}
Feb 17 13:45:59 crc kubenswrapper[4804]: I0217 13:45:59.534859 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rzcfd"
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.178984 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"5e0418dbdb94699ad2d329e623c23318e9ad1a1365dbd4cfbe0edb34b9c66c02"}
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.179365 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"b6f4a6092962720efac5162bdeaa71212bcf64d18f13f7730dc09cfc8dd63143"}
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.179381 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"fd27f69001d22a9707a7ed63ebbfea2eb121d3fa41a19780099c31844ef8b617"}
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.179393 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"f6f8139dd06cc10cc071d3c4100e7a1764d859000d5a92c2c2bfd6734820d392"}
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.411606 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc"
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.474903 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") "
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.474997 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") "
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475040 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") "
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475092 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") "
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.474997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475159 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") "
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475242 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475265 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") "
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475189 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run" (OuterVolumeSpecName: "var-run") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475660 4804 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475673 4804 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475681 4804 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475894 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.476009 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts" (OuterVolumeSpecName: "scripts") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.479679 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn" (OuterVolumeSpecName: "kube-api-access-qgdkn") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "kube-api-access-qgdkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.577060 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.577098 4804 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.577111 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.192885 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"f9de6394a041e5bc5915e2c672f2446d62b1bebdb25675d7caab649a201be2b3"}
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.196197 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-vlgpc" event={"ID":"5bce6d8f-9e27-4d98-8003-f5e7b368f816","Type":"ContainerDied","Data":"27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded"}
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.196260 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.196286 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.561317 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"]
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.572655 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"]
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.628384 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.663249 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"]
Feb 17 13:46:01 crc kubenswrapper[4804]: E0217 13:46:01.663739 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerName="swift-ring-rebalance"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.663763 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerName="swift-ring-rebalance"
Feb 17 13:46:01 crc kubenswrapper[4804]: E0217 13:46:01.663802 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerName="ovn-config"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.663812 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerName="ovn-config"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.664013 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerName="ovn-config"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.664041 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerName="swift-ring-rebalance"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.666303 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.669378 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.703068 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"]
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802316 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802346 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802411 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802472 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802518 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.903996 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904082 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904179 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904236 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904488 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904544 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904859 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.905482 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.913923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.925806 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.987225 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.212890 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"686e2cfe93aa9073aa5c053faedb8c10fef0f4ced630f109a9553ace832d1d80"}
Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.213307 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"1da734f3e094b44514d0b3c69a6a00f8a1495e89176bb397c8fe57eeacf4ad38"}
Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.485146 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"]
Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.588621 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" path="/var/lib/kubelet/pods/5bce6d8f-9e27-4d98-8003-f5e7b368f816/volumes"
Feb 17 13:46:03 crc kubenswrapper[4804]: I0217 13:46:03.221890 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerStarted","Data":"68ee1b720b46b359f54d44cc86dbc4b93cbd1ca1b6ba26986e9fe10248b66933"}
Feb 17 13:46:04 crc kubenswrapper[4804]: I0217 13:46:04.240792 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerStarted","Data":"525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c"}
Feb 17 13:46:05 crc kubenswrapper[4804]: I0217 13:46:05.251750 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"7bd65056ded46a7f415c5114a632dee5a15e7d850a90a8b84394e91556a340bc"}
Feb 17 13:46:05 crc kubenswrapper[4804]: I0217 13:46:05.254527 4804 generic.go:334] "Generic (PLEG): container finished" podID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerID="525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c" exitCode=0
Feb 17 13:46:05 crc kubenswrapper[4804]: I0217 13:46:05.254561 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerDied","Data":"525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.273595 4804 generic.go:334] "Generic (PLEG): container finished" podID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerID="95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3" exitCode=0
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.273814 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerDied","Data":"95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294173 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"22f6ed363823056fbc0fd4a7c4cfe3d602bfcf8bfc5e65a12a3cde9a7e4b9bc6"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294234 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"f202da3b35bdf2ec3ead0862da3a8cbdee11edb5799c47d9b747d39fabd758c1"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294247 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"0cb79acfd998f6c023daac6db3e211de72172d9ea7ee223aeab1f68d231801de"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294257 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"93bebab89c543415262fbcca547ec300f432db9d85f2f1f08f1bb4d91eaa9893"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294267 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"fe1ecd36edd66ab9c042fcd2efeb8282870b6925b9fa54394bcdfd94212a4bdb"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294277 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"91e9eadda5958c02bad315ad2fa1dc5c0fb9327307643e907256269aea3a4d1f"}
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.640996 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.681785 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") "
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.681936 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") "
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.681978 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") "
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682067 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") "
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682238 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") "
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682310 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") "
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682557 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run" (OuterVolumeSpecName: "var-run") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682619 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683056 4804 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683088 4804 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683088 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683666 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.684686 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts" (OuterVolumeSpecName: "scripts") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.686711 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z" (OuterVolumeSpecName: "kube-api-access-9bk2z") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "kube-api-access-9bk2z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784831 4804 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784863 4804 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784878 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784889 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.309234 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"bb4ff7d1735d0f3e000d70765610bd98df2ee0015c339b7c178e631d7b1ad325"}
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.310827 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.310833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerDied","Data":"68ee1b720b46b359f54d44cc86dbc4b93cbd1ca1b6ba26986e9fe10248b66933"}
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.310877 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68ee1b720b46b359f54d44cc86dbc4b93cbd1ca1b6ba26986e9fe10248b66933"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.344844 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=24.254387256 podStartE2EDuration="31.344828247s" podCreationTimestamp="2026-02-17 13:45:36 +0000 UTC" firstStartedPulling="2026-02-17 13:45:58.086506392 +0000 UTC m=+1232.197925729" lastFinishedPulling="2026-02-17 13:46:05.176947383 +0000 UTC m=+1239.288366720" observedRunningTime="2026-02-17 13:46:07.342859175 +0000 UTC m=+1241.454278512" watchObservedRunningTime="2026-02-17 13:46:07.344828247 +0000 UTC m=+1241.456247584"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.636907 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"]
Feb 17 13:46:07 crc kubenswrapper[4804]: E0217 13:46:07.637541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerName="ovn-config"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.637557 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerName="ovn-config"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.637696 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerName="ovn-config"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.638476 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.640716 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.645605 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"]
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.700892 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701015 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701144 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl"
Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701177 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod
\"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701271 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701293 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.721005 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"] Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.727674 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"] Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.785993 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802449 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802545 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802612 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802642 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802658 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803438 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803478 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803482 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803607 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.804034 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.832442 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904281 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904437 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904565 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904591 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkcbn\" (UniqueName: 
\"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.910120 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.910931 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn" (OuterVolumeSpecName: "kube-api-access-zkcbn") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "kube-api-access-zkcbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.935704 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.945913 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data" (OuterVolumeSpecName: "config-data") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.964597 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006306 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006341 4804 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006354 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006367 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.326891 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerDied","Data":"8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18"} Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.327291 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.327007 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.423668 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.612673 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" path="/var/lib/kubelet/pods/320d7daa-75d5-47da-8895-e49aa4bdbd01/volumes" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.765322 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.848807 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:08 crc kubenswrapper[4804]: E0217 13:46:08.849247 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerName="glance-db-sync" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.849273 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerName="glance-db-sync" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.849483 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerName="glance-db-sync" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.850727 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.870674 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938847 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938881 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938922 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938945 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.939177 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.939264 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040673 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040717 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040791 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040809 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040864 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.041767 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.041868 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.041877 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.042096 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.042812 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.070173 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.168463 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.336548 4804 generic.go:334] "Generic (PLEG): container finished" podID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerID="e097e65733863a0ea477698b59924cd597cb6c636bea03eec20a7fcebf703c21" exitCode=0 Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.336701 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" event={"ID":"e1f0a7c0-6169-479c-ac5c-9a30f7619603","Type":"ContainerDied","Data":"e097e65733863a0ea477698b59924cd597cb6c636bea03eec20a7fcebf703c21"} Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.336867 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" event={"ID":"e1f0a7c0-6169-479c-ac5c-9a30f7619603","Type":"ContainerStarted","Data":"1a11ba1d0a8c306ea4b2a4f940ad10c27214aa9a1c4f3dea92204634530ef96a"} Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.618080 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:09 crc kubenswrapper[4804]: W0217 13:46:09.618337 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd29250cb_6c2b_4994_ba6f_f3b7239ec3e2.slice/crio-9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f WatchSource:0}: Error finding container 9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f: Status 404 returned error can't find the container with id 9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.761500 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856775 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856888 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856917 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856982 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.857006 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.857104 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.861681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc" (OuterVolumeSpecName: "kube-api-access-7vplc") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "kube-api-access-7vplc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.878450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.879387 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.880012 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config" (OuterVolumeSpecName: "config") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.881347 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.882550 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959577 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959609 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959619 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959630 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 
crc kubenswrapper[4804]: I0217 13:46:09.959639 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959649 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.346980 4804 generic.go:334] "Generic (PLEG): container finished" podID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerID="db5a3b86c0d8b3db5d6271f9217c22ad17bdcc258a7e248a0fd7a959c200bb06" exitCode=0 Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.347043 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerDied","Data":"db5a3b86c0d8b3db5d6271f9217c22ad17bdcc258a7e248a0fd7a959c200bb06"} Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.347094 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerStarted","Data":"9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f"} Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.349346 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" event={"ID":"e1f0a7c0-6169-479c-ac5c-9a30f7619603","Type":"ContainerDied","Data":"1a11ba1d0a8c306ea4b2a4f940ad10c27214aa9a1c4f3dea92204634530ef96a"} Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.349388 4804 scope.go:117] "RemoveContainer" containerID="e097e65733863a0ea477698b59924cd597cb6c636bea03eec20a7fcebf703c21" Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.349465 4804 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.570672 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.587077 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.358607 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerStarted","Data":"d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f"} Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.358801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.385372 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podStartSLOduration=3.3853435530000002 podStartE2EDuration="3.385343553s" podCreationTimestamp="2026-02-17 13:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:11.37760546 +0000 UTC m=+1245.489024807" watchObservedRunningTime="2026-02-17 13:46:11.385343553 +0000 UTC m=+1245.496762900" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.500436 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.800345 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 13:46:11 crc kubenswrapper[4804]: E0217 13:46:11.801082 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerName="init" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.801100 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerName="init" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.801314 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerName="init" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.801962 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.825974 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.893335 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.893406 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.903759 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.920361 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.927193 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.928316 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.928417 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.959583 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.972399 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995042 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995092 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995162 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvb4p\" (UniqueName: 
\"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995262 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995374 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.996334 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.023892 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmllm\" (UniqueName: 
\"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096645 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096747 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.098094 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.098807 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.112025 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.113308 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.116184 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.121418 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.122616 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.123584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.124752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.161924 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.163175 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.166457 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.166747 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.166953 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.167849 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.178275 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202749 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202813 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202871 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9kw\" (UniqueName: 
\"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202953 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202974 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.226265 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.227425 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.246099 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.284766 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.292002 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304883 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304915 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304944 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304980 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"neutron-db-create-hdmw5\" 
(UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.305023 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.305814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.306987 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.310118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.312970 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " 
pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.326049 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.333830 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.408044 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.408089 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.408721 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 
17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.432966 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.508449 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.521052 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.537530 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.539724 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.540647 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.558597 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.562468 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.571643 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.597810 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" path="/var/lib/kubelet/pods/e1f0a7c0-6169-479c-ac5c-9a30f7619603/volumes" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.613971 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.614347 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.716619 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 
13:46:12.716746 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.717962 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.741333 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.928585 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.946710 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:46:12 crc kubenswrapper[4804]: W0217 13:46:12.955944 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f1cc1f_a736_4c02_9c26_726c0c6f0d59.slice/crio-98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680 WatchSource:0}: Error finding container 98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680: Status 404 returned error can't find the container with id 98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680 Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.049018 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.140279 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.246167 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.322906 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:46:13 crc kubenswrapper[4804]: W0217 13:46:13.347686 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd9036c7_1cff_4fb8_9af2_90057c4251dc.slice/crio-1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de WatchSource:0}: Error finding container 1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de: Status 404 returned error can't find the container with id 1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de Feb 17 
13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.382418 4804 generic.go:334] "Generic (PLEG): container finished" podID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerID="2af0e585925ef4ba3eb4997ba9a346fe72a20fb7f9f2943dcb04719e80a69278" exitCode=0 Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.382508 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ncwmc" event={"ID":"4895769c-ef45-40c8-a8ae-0c5cb954dab2","Type":"ContainerDied","Data":"2af0e585925ef4ba3eb4997ba9a346fe72a20fb7f9f2943dcb04719e80a69278"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.382564 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ncwmc" event={"ID":"4895769c-ef45-40c8-a8ae-0c5cb954dab2","Type":"ContainerStarted","Data":"c96f9b2d88b6adfa9b44dae8cb976cc639f7f821be9d2828db0153f577444bf0"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.383976 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7982-account-create-update-pd5b7" event={"ID":"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc","Type":"ContainerStarted","Data":"584460ac40b379634789213fb9875e27bc44f0755fab8cd37c4bc1a2c224a708"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.386120 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerStarted","Data":"1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.387350 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hdmw5" event={"ID":"e26c9257-7102-4d48-8999-c0a3f0ca4009","Type":"ContainerStarted","Data":"ff80344ef1bac1d5d7fbda5968ed0c5c11a256e94e9d793d91dda8265547ba1a"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.388954 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" 
event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerStarted","Data":"9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.388978 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerStarted","Data":"601e52925bb957989c7f25f9d646d8656693a83832f601e09c2331054b955310"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.394666 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerStarted","Data":"195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.394696 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerStarted","Data":"98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.417162 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-46zbc" podStartSLOduration=2.41713908 podStartE2EDuration="2.41713908s" podCreationTimestamp="2026-02-17 13:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:13.415972974 +0000 UTC m=+1247.527392311" watchObservedRunningTime="2026-02-17 13:46:13.41713908 +0000 UTC m=+1247.528558417" Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.444081 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-d59c-account-create-update-phgft" podStartSLOduration=2.444064177 podStartE2EDuration="2.444064177s" podCreationTimestamp="2026-02-17 13:46:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:13.438425539 +0000 UTC m=+1247.549844876" watchObservedRunningTime="2026-02-17 13:46:13.444064177 +0000 UTC m=+1247.555483514" Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.460855 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.411910 4804 generic.go:334] "Generic (PLEG): container finished" podID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerID="7dbf5f5d88a50f9cfadbbf6692ca887131d2b4df1c33d00e1f7267394ff4525b" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.412055 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hdmw5" event={"ID":"e26c9257-7102-4d48-8999-c0a3f0ca4009","Type":"ContainerDied","Data":"7dbf5f5d88a50f9cfadbbf6692ca887131d2b4df1c33d00e1f7267394ff4525b"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.413863 4804 generic.go:334] "Generic (PLEG): container finished" podID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerID="9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.413947 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerDied","Data":"9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.415860 4804 generic.go:334] "Generic (PLEG): container finished" podID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerID="195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.415938 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerDied","Data":"195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.417744 4804 generic.go:334] "Generic (PLEG): container finished" podID="26fadc7a-6cf8-4ea0-8609-50e585db4115" containerID="e16d35978c1a93f38aec046090d4bb89a7fa37eda37be7158b82151bac67e327" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.417808 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b2-account-create-update-648xj" event={"ID":"26fadc7a-6cf8-4ea0-8609-50e585db4115","Type":"ContainerDied","Data":"e16d35978c1a93f38aec046090d4bb89a7fa37eda37be7158b82151bac67e327"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.417830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b2-account-create-update-648xj" event={"ID":"26fadc7a-6cf8-4ea0-8609-50e585db4115","Type":"ContainerStarted","Data":"86de1d053faa9d936fb63918e324800451303ce7017f7f3db74c27bc93776276"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.419408 4804 generic.go:334] "Generic (PLEG): container finished" podID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerID="4a8cd13cbb3ba23bfa180f42dc167734c03b2d4bcdf0842db5532816b1f0b9bd" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.419461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7982-account-create-update-pd5b7" event={"ID":"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc","Type":"ContainerDied","Data":"4a8cd13cbb3ba23bfa180f42dc167734c03b2d4bcdf0842db5532816b1f0b9bd"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.741142 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.854420 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.855124 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4895769c-ef45-40c8-a8ae-0c5cb954dab2" (UID: "4895769c-ef45-40c8-a8ae-0c5cb954dab2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.855505 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.855920 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.866573 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm" (OuterVolumeSpecName: "kube-api-access-nmllm") pod "4895769c-ef45-40c8-a8ae-0c5cb954dab2" (UID: "4895769c-ef45-40c8-a8ae-0c5cb954dab2"). InnerVolumeSpecName "kube-api-access-nmllm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.957492 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:15 crc kubenswrapper[4804]: I0217 13:46:15.428837 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:15 crc kubenswrapper[4804]: I0217 13:46:15.430284 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ncwmc" event={"ID":"4895769c-ef45-40c8-a8ae-0c5cb954dab2","Type":"ContainerDied","Data":"c96f9b2d88b6adfa9b44dae8cb976cc639f7f821be9d2828db0153f577444bf0"} Feb 17 13:46:15 crc kubenswrapper[4804]: I0217 13:46:15.430322 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c96f9b2d88b6adfa9b44dae8cb976cc639f7f821be9d2828db0153f577444bf0" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.273490 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.280982 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.291928 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.295176 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.311173 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.316355 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.316625 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.316808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"60ee8426-dcbf-4430-8594-68ee778a8bbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317002 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"60ee8426-dcbf-4430-8594-68ee778a8bbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317157 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"26fadc7a-6cf8-4ea0-8609-50e585db4115\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317290 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"26fadc7a-6cf8-4ea0-8609-50e585db4115\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317292 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" (UID: "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317582 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60ee8426-dcbf-4430-8594-68ee778a8bbc" (UID: "60ee8426-dcbf-4430-8594-68ee778a8bbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317745 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26fadc7a-6cf8-4ea0-8609-50e585db4115" (UID: "26fadc7a-6cf8-4ea0-8609-50e585db4115"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.318154 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.318179 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.318193 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.324702 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n" (OuterVolumeSpecName: "kube-api-access-rt87n") pod "60ee8426-dcbf-4430-8594-68ee778a8bbc" (UID: "60ee8426-dcbf-4430-8594-68ee778a8bbc"). InnerVolumeSpecName "kube-api-access-rt87n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.327736 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw" (OuterVolumeSpecName: "kube-api-access-2gwpw") pod "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" (UID: "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc"). InnerVolumeSpecName "kube-api-access-2gwpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.329322 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696" (OuterVolumeSpecName: "kube-api-access-td696") pod "26fadc7a-6cf8-4ea0-8609-50e585db4115" (UID: "26fadc7a-6cf8-4ea0-8609-50e585db4115"). InnerVolumeSpecName "kube-api-access-td696". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.418770 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419389 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"e26c9257-7102-4d48-8999-c0a3f0ca4009\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419443 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35f1cc1f-a736-4c02-9c26-726c0c6f0d59" (UID: "35f1cc1f-a736-4c02-9c26-726c0c6f0d59"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419463 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419516 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"e26c9257-7102-4d48-8999-c0a3f0ca4009\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419899 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419923 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419936 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419953 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419933 4804 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e26c9257-7102-4d48-8999-c0a3f0ca4009" (UID: "e26c9257-7102-4d48-8999-c0a3f0ca4009"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.424175 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h" (OuterVolumeSpecName: "kube-api-access-zsf6h") pod "e26c9257-7102-4d48-8999-c0a3f0ca4009" (UID: "e26c9257-7102-4d48-8999-c0a3f0ca4009"). InnerVolumeSpecName "kube-api-access-zsf6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.424314 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p" (OuterVolumeSpecName: "kube-api-access-wvb4p") pod "35f1cc1f-a736-4c02-9c26-726c0c6f0d59" (UID: "35f1cc1f-a736-4c02-9c26-726c0c6f0d59"). InnerVolumeSpecName "kube-api-access-wvb4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.457376 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerDied","Data":"601e52925bb957989c7f25f9d646d8656693a83832f601e09c2331054b955310"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.457448 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601e52925bb957989c7f25f9d646d8656693a83832f601e09c2331054b955310" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.457410 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.460847 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerDied","Data":"98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.460977 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.460865 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.462310 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b2-account-create-update-648xj" event={"ID":"26fadc7a-6cf8-4ea0-8609-50e585db4115","Type":"ContainerDied","Data":"86de1d053faa9d936fb63918e324800451303ce7017f7f3db74c27bc93776276"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.462364 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.462382 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86de1d053faa9d936fb63918e324800451303ce7017f7f3db74c27bc93776276" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.464624 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7982-account-create-update-pd5b7" event={"ID":"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc","Type":"ContainerDied","Data":"584460ac40b379634789213fb9875e27bc44f0755fab8cd37c4bc1a2c224a708"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.464674 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.464662 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="584460ac40b379634789213fb9875e27bc44f0755fab8cd37c4bc1a2c224a708" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.466095 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hdmw5" event={"ID":"e26c9257-7102-4d48-8999-c0a3f0ca4009","Type":"ContainerDied","Data":"ff80344ef1bac1d5d7fbda5968ed0c5c11a256e94e9d793d91dda8265547ba1a"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.466145 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff80344ef1bac1d5d7fbda5968ed0c5c11a256e94e9d793d91dda8265547ba1a" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.466221 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.521220 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.521253 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.521265 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.170346 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.245699 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"] Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.250872 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns" containerID="cri-o://f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49" gracePeriod=10 Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.480345 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerStarted","Data":"ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde"} Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.505100 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dgzbs" podStartSLOduration=2.243136194 podStartE2EDuration="7.505071385s" podCreationTimestamp="2026-02-17 13:46:12 +0000 UTC" firstStartedPulling="2026-02-17 13:46:13.350576067 +0000 UTC m=+1247.461995404" lastFinishedPulling="2026-02-17 13:46:18.612511248 +0000 UTC m=+1252.723930595" observedRunningTime="2026-02-17 13:46:19.499256723 +0000 UTC m=+1253.610676060" watchObservedRunningTime="2026-02-17 13:46:19.505071385 +0000 UTC m=+1253.616490722" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.238494 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351596 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351750 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351809 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351874 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351930 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.358813 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq" (OuterVolumeSpecName: "kube-api-access-zfbbq") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "kube-api-access-zfbbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.399284 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.405554 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.414880 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.419918 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config" (OuterVolumeSpecName: "config") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454163 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454240 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454255 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454266 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454277 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492080 4804 generic.go:334] "Generic (PLEG): container finished" podID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49" exitCode=0 Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492177 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9"
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerDied","Data":"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"}
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492311 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerDied","Data":"cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f"}
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492341 4804 scope.go:117] "RemoveContainer" containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.541521 4804 scope.go:117] "RemoveContainer" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0"
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.542387 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"]
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.552063 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"]
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.570380 4804 scope.go:117] "RemoveContainer" containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"
Feb 17 13:46:20 crc kubenswrapper[4804]: E0217 13:46:20.570967 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49\": container with ID starting with f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49 not found: ID does not exist" containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.571024 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"} err="failed to get container status \"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49\": rpc error: code = NotFound desc = could not find container \"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49\": container with ID starting with f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49 not found: ID does not exist"
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.571057 4804 scope.go:117] "RemoveContainer" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0"
Feb 17 13:46:20 crc kubenswrapper[4804]: E0217 13:46:20.571559 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0\": container with ID starting with 2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0 not found: ID does not exist" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0"
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.571716 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0"} err="failed to get container status \"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0\": rpc error: code = NotFound desc = could not find container \"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0\": container with ID starting with 2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0 not found: ID does not exist"
Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.586079 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" path="/var/lib/kubelet/pods/86aca321-b4a3-4d89-ab34-5d311aa11fe9/volumes"
Feb 17 13:46:22 crc kubenswrapper[4804]: I0217 13:46:22.512100 4804 generic.go:334] "Generic (PLEG): container finished" podID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerID="ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde" exitCode=0
Feb 17 13:46:22 crc kubenswrapper[4804]: I0217 13:46:22.512214 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerDied","Data":"ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde"}
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.823862 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dgzbs"
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.913500 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") "
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.913608 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") "
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.913710 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") "
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.918883 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw" (OuterVolumeSpecName: "kube-api-access-rn9kw") pod "fd9036c7-1cff-4fb8-9af2-90057c4251dc" (UID: "fd9036c7-1cff-4fb8-9af2-90057c4251dc"). InnerVolumeSpecName "kube-api-access-rn9kw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.936348 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd9036c7-1cff-4fb8-9af2-90057c4251dc" (UID: "fd9036c7-1cff-4fb8-9af2-90057c4251dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.960999 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data" (OuterVolumeSpecName: "config-data") pod "fd9036c7-1cff-4fb8-9af2-90057c4251dc" (UID: "fd9036c7-1cff-4fb8-9af2-90057c4251dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.015743 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.015783 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.015799 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.530430 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerDied","Data":"1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de"}
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.530470 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.530536 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dgzbs"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799456 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"]
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799880 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799900 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799922 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerName="keystone-db-sync"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799931 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerName="keystone-db-sync"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799949 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="init"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799958 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="init"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799971 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799978 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799993 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800002 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800012 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800019 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800037 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800045 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800063 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800070 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800083 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800090 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800287 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800305 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800315 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800329 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800341 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800353 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerName="mariadb-database-create"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800361 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerName="mariadb-account-create-update"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800370 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerName="keystone-db-sync"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.801380 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.821467 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"]
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.830886 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.830967 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831027 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831088 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831119 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831161 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.842120 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kmbrx"]
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.843218 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.846974 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.847318 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.847700 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.848476 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.848631 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.877674 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"]
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.933614 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.933963 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934090 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934247 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934427 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934526 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934635 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934996 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.935106 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.937041 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.937622 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.938974 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.939017 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.939285 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.939580 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.992841 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041063 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041135 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041176 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041226 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041250 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041305 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.049260 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f9zkj"]
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.054357 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.059869 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r5hqb"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.060243 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.062786 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.062885 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.066608 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.066911 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.066980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.082262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.092603 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f9zkj"]
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.093111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.114435 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"]
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.116183 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.121565 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.121803 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-75jkk"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.122003 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.123882 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.124613 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.142661 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"]
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151014 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151078 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151103 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151173 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151212 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151371 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151420 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.168576 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.204697 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.207618 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.216948 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.217178 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.230315 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253547 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253600 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253621 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253683 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253727 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253750 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253797 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6"
Feb 17 13:46:25 crc 
kubenswrapper[4804]: I0217 13:46:25.253824 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253847 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253880 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253933 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253964 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253991 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.254017 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.254041 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.255184 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.258025 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " 
pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.260418 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.260758 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.261423 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.262946 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.268644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.269120 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.302666 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.314434 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.314554 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.337859 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jltn7"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.340616 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.355558 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.355888 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.356057 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mckmx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.357107 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.366102 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369527 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369606 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 
13:46:25.369728 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369824 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369847 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369875 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.372538 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.373362 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"ceilometer-0\" (UID: 
\"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.378044 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.380019 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.380837 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.382701 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.389519 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.397659 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jz9x9"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.398729 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.406854 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jltn7"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.432098 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jz9x9"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.432359 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.436037 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6zhqd" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.449458 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.452392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.463054 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.464777 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.469641 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.469820 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-r5s28" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.469929 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.470126 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471009 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471045 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471084 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471112 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471132 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471158 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471216 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471237 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471256 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471284 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.473477 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.480882 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.496601 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.498153 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.506374 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xf9m6"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.507586 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.514254 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.515595 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.522829 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523377 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dr6jm" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523506 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523793 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523911 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.530010 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.558276 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xf9m6"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 
13:46:25.560094 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585176 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585237 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585279 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585306 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"neutron-db-sync-jltn7\" (UID: 
\"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585326 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585352 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585406 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585429 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585447 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585466 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585482 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585496 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585516 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: 
\"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585534 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585566 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586296 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586333 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586702 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586725 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587456 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v577c\" (UniqueName: 
\"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587678 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587906 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588322 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: 
\"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588427 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588587 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588959 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588984 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589008 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " 
pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589036 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589059 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589086 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589103 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589130 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589152 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.590582 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.590968 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.603727 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.685749 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.687711 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.687965 4804 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.688059 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.688384 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.688799 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.691332 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.694186 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698386 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698494 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698518 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698544 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " 
pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698572 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698631 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698659 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698681 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698700 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698721 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698742 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698767 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698795 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698859 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698879 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698895 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698918 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod 
\"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698967 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698993 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699023 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699046 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699090 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod 
\"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699116 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.715163 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.716272 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.717844 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.718550 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc 
kubenswrapper[4804]: I0217 13:46:25.719515 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.720718 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.721583 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.736188 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.750641 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.750668 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: 
\"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.751378 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.752750 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.754689 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.756855 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.759168 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" 
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.766873 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.782346 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.782866 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.784470 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jz9x9"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786378 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786774 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786867 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.793326 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.797579 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.822192 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.822229 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.822799 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.823341 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.825408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.826638 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.833651 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.835773 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.835838 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.839714 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.848037 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.848155 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7" gracePeriod=600
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.848427 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.902935 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xf9m6"
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.961515 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"]
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.030007 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj"
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.093735 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.161350 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"]
Feb 17 13:46:26 crc kubenswrapper[4804]: W0217 13:46:26.454343 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a921c8_6579_451b_beaf_9832cf900668.slice/crio-ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c WatchSource:0}: Error finding container ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c: Status 404 returned error can't find the container with id ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.455518 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f9zkj"]
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.487006 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"]
Feb 17 13:46:26 crc kubenswrapper[4804]: W0217 13:46:26.488272 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod429f2d90_393a_4205_9597_4a1d92dd15be.slice/crio-fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff WatchSource:0}: Error finding container fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff: Status 404 returned error can't find the container with id fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.569460 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerStarted","Data":"22cc18bf0c3204362054a8f4e626573eec2da03d934e88db8cdb3c72fda9d1e5"}
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.572333 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7" exitCode=0
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.572367 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7"}
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.572434 4804 scope.go:117] "RemoveContainer" containerID="0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69"
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.584972 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c54d4859c-6cf2w" event={"ID":"429f2d90-393a-4205-9597-4a1d92dd15be","Type":"ContainerStarted","Data":"fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff"}
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.585003 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerStarted","Data":"ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c"}
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.585014 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" event={"ID":"c8639d8d-c367-40a9-b26c-c7c301b82609","Type":"ContainerStarted","Data":"46aadc0bc9a1c8319216c31393e9e2b8f0ba23fa61327f5c2056cca1df2f582b"}
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.661552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"]
Feb 17 13:46:26 crc kubenswrapper[4804]: W0217 13:46:26.718496 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf15102ce_82ca_49c8_a069_25469380b043.slice/crio-87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14 WatchSource:0}: Error finding container 87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14: Status 404 returned error can't find the container with id 87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14
Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.721342 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jltn7"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.066837 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.087921 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.096384 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xf9m6"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.105476 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jz9x9"]
Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.115555 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa3f342_a062_421d_8c06_f53468a8db00.slice/crio-cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50 WatchSource:0}: Error finding container cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50: Status 404 returned error can't find the container with id cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50
Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.132743 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19dd0c13_b898_4147_ae5f_cbc5d4915910.slice/crio-3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874 WatchSource:0}: Error finding container 3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874: Status 404 returned error can't find the container with id 3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874
Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.170369 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14e1fc7b_0e6c_4377_b4e0_74e77e951b0d.slice/crio-ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75 WatchSource:0}: Error finding container ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75: Status 404 returned error can't find the container with id ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75
Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.186094 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5ccd477_88cd_4284_9de7_f336def1c7a1.slice/crio-9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0 WatchSource:0}: Error finding container 9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0: Status 404 returned error can't find the container with id 9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.215855 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.268066 4804 scope.go:117] "RemoveContainer" containerID="936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.364831 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.379998 4804 scope.go:117] "RemoveContainer" containerID="4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.394836 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.476079 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b56868599-9s4h9"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.478892 4804 scope.go:117] "RemoveContainer" containerID="a58356e342b8d1a0c197b929d754c94eace180ca8295bdab19e683e521269b3f"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.480043 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.517497 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.530414 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.557037 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572321 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572491 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572527 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.630568 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerStarted","Data":"fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.645364 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.648550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerStarted","Data":"9b93e2a2a156279c02c6d5f10d5f5e53f3649c24aa9d244b7707ada20e287204"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.652018 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerStarted","Data":"cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.658398 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kmbrx" podStartSLOduration=3.658292 podStartE2EDuration="3.658292s" podCreationTimestamp="2026-02-17 13:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:27.648885835 +0000 UTC m=+1261.760305172" watchObservedRunningTime="2026-02-17 13:46:27.658292 +0000 UTC m=+1261.769711337"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.665880 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerStarted","Data":"3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676499 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676577 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676806 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.677629 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.678550 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.678752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.691284 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.699757 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.702923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.703492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d659f57fc-rp4h6" event={"ID":"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9","Type":"ContainerStarted","Data":"38398873d4e244754b25fb3ccf4b8d269e3c191649fc54d1a45b9951042bdc7f"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.705814 4804 generic.go:334] "Generic (PLEG): container finished" podID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerID="83533c9dfdbc39c545b70abc4b708d58af7e792f59972297b23faa02fcbb40b8" exitCode=0
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.705862 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" event={"ID":"c8639d8d-c367-40a9-b26c-c7c301b82609","Type":"ContainerDied","Data":"83533c9dfdbc39c545b70abc4b708d58af7e792f59972297b23faa02fcbb40b8"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.721067 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerStarted","Data":"4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.721108 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerStarted","Data":"87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.723468 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerStarted","Data":"ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75"}
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.753576 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jltn7" podStartSLOduration=2.753554254 podStartE2EDuration="2.753554254s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:27.743054724 +0000 UTC m=+1261.854474061" watchObservedRunningTime="2026-02-17 13:46:27.753554254 +0000 UTC m=+1261.864973591"
Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.818512 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9"
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.117463 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.187262 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") "
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188287 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") "
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188458 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") "
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188512 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") "
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188566 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") "
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188605 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") "
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.226346 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2" (OuterVolumeSpecName: "kube-api-access-prjx2") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "kube-api-access-prjx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.233816 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.238543 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.242829 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.244169 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.250603 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config" (OuterVolumeSpecName: "config") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.264833 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292374 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292405 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292416 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292425 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292435 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.293094 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.492808 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"]
Feb 17 13:46:28 crc kubenswrapper[4804]: W0217 13:46:28.553218 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40906b1_b78c_4e65_9c35_346626adeba3.slice/crio-0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac WatchSource:0}: Error finding container 0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac: Status 404 returned error can't find the container with id 0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.771270 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" event={"ID":"c8639d8d-c367-40a9-b26c-c7c301b82609","Type":"ContainerDied","Data":"46aadc0bc9a1c8319216c31393e9e2b8f0ba23fa61327f5c2056cca1df2f582b"}
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.771683 4804 scope.go:117] "RemoveContainer" containerID="83533c9dfdbc39c545b70abc4b708d58af7e792f59972297b23faa02fcbb40b8"
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.771576 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j"
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.777829 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerStarted","Data":"08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd"}
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.780126 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b56868599-9s4h9" event={"ID":"c40906b1-b78c-4e65-9c35-346626adeba3","Type":"ContainerStarted","Data":"0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac"}
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.783356 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fa3f342-a062-421d-8c06-f53468a8db00" containerID="63be9f06e01e3909b7ff94ea9b177c0a528139e2942719322a381a426d4f2574" exitCode=0
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.785319 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerDied","Data":"63be9f06e01e3909b7ff94ea9b177c0a528139e2942719322a381a426d4f2574"}
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.787964 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerStarted","Data":"49f64571d9f610637811cf86586750a6d15c78928db1891dc4609e905bc4b08c"}
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.831339 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"]
Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.847731 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"]
Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.863608 4804 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerStarted","Data":"b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41"} Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.864632 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.875339 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerStarted","Data":"dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765"} Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.892798 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" podStartSLOduration=4.892770547 podStartE2EDuration="4.892770547s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:29.885111267 +0000 UTC m=+1263.996530614" watchObservedRunningTime="2026-02-17 13:46:29.892770547 +0000 UTC m=+1264.004189874" Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.593731 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" path="/var/lib/kubelet/pods/c8639d8d-c367-40a9-b26c-c7c301b82609/volumes" Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.890822 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" containerID="cri-o://08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd" gracePeriod=30 Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.891110 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerStarted","Data":"8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e"} Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.891422 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" containerID="cri-o://8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e" gracePeriod=30 Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.918733 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.918718167 podStartE2EDuration="5.918718167s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:30.913072459 +0000 UTC m=+1265.024491806" watchObservedRunningTime="2026-02-17 13:46:30.918718167 +0000 UTC m=+1265.030137504" Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.906039 4804 generic.go:334] "Generic (PLEG): container finished" podID="53073bd8-b356-4cb8-a190-db417f233b63" containerID="fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0" exitCode=0 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.906583 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerDied","Data":"fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0"} Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.916830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerStarted","Data":"8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709"} Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.917000 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" containerID="cri-o://dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765" gracePeriod=30 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.917292 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" containerID="cri-o://8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709" gracePeriod=30 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933011 4804 generic.go:334] "Generic (PLEG): container finished" podID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerID="8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e" exitCode=0 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933052 4804 generic.go:334] "Generic (PLEG): container finished" podID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerID="08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd" exitCode=143 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933081 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerDied","Data":"8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e"} Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933115 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerDied","Data":"08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd"} Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.945657 4804 generic.go:334] "Generic (PLEG): container finished" podID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerID="8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709" exitCode=0 Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.945942 4804 generic.go:334] "Generic (PLEG): container finished" podID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerID="dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765" exitCode=143 Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.945821 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerDied","Data":"8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709"} Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.946182 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerDied","Data":"dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765"} Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.616090 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.624573 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.654335 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.654311106 podStartE2EDuration="8.654311106s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:31.97438246 +0000 UTC m=+1266.085801807" watchObservedRunningTime="2026-02-17 13:46:33.654311106 +0000 UTC m=+1267.765730443" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715734 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715792 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715827 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715868 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: 
\"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715899 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715977 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715995 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716034 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716094 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716123 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716157 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716221 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.717219 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.729978 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts" (OuterVolumeSpecName: "scripts") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.733572 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs" (OuterVolumeSpecName: "logs") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.748465 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.748637 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.749308 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c" (OuterVolumeSpecName: "kube-api-access-lcv6c") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "kube-api-access-lcv6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.752650 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4" (OuterVolumeSpecName: "kube-api-access-j2bf4") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "kube-api-access-j2bf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.755527 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.756506 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts" (OuterVolumeSpecName: "scripts") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.778687 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data" (OuterVolumeSpecName: "config-data") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.789589 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.793238 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.797719 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817844 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817885 4804 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817900 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817912 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817921 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817933 4804 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817945 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817956 4804 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817965 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817974 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817986 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817999 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.818009 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.821150 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data" (OuterVolumeSpecName: "config-data") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.852553 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.919044 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.919076 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.968084 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerDied","Data":"9b93e2a2a156279c02c6d5f10d5f5e53f3649c24aa9d244b7707ada20e287204"} Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.968094 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.968149 4804 scope.go:117] "RemoveContainer" containerID="8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.971604 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerDied","Data":"22cc18bf0c3204362054a8f4e626573eec2da03d934e88db8cdb3c72fda9d1e5"} Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.971647 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.971647 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22cc18bf0c3204362054a8f4e626573eec2da03d934e88db8cdb3c72fda9d1e5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.017763 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.055395 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068179 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: E0217 13:46:34.068649 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068672 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" Feb 17 13:46:34 crc kubenswrapper[4804]: E0217 13:46:34.068709 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068717 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" Feb 17 13:46:34 crc kubenswrapper[4804]: E0217 13:46:34.068731 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53073bd8-b356-4cb8-a190-db417f233b63" containerName="keystone-bootstrap" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068739 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="53073bd8-b356-4cb8-a190-db417f233b63" containerName="keystone-bootstrap" Feb 17 13:46:34 crc kubenswrapper[4804]: 
E0217 13:46:34.068754 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerName="init" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068761 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerName="init" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068963 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="53073bd8-b356-4cb8-a190-db417f233b63" containerName="keystone-bootstrap" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068988 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.069010 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerName="init" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.069022 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.070108 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.072761 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.072968 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.096653 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.116346 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.129857 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.177892 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7kgzk"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.183383 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.184894 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.185866 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186121 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186323 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186500 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186537 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223746 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223784 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223812 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224186 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224324 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224378 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331542 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331610 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331653 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331739 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331795 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331820 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331880 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331946 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331989 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332043 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332105 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332131 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332172 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332647 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332888 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.333412 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.339802 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.340053 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.340286 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.340434 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.360033 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.376115 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.377397 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.395580 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.413593 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.434490 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.434886 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435006 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435034 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435094 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435115 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.440159 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.440405 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.440853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.441455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " 
pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.442129 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.451775 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.457923 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.474376 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.481784 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.489926 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.496249 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.514175 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.517259 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9ffb6f5c6-fczv5"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.519151 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537086 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-combined-ca-bundle\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537175 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-config-data\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537267 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537324 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 
13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537351 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-logs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537411 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vf9\" (UniqueName: \"kubernetes.io/projected/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-kube-api-access-v9vf9\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537448 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-tls-certs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537475 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 
13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537498 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537544 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-scripts\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537613 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-secret-key\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537737 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc 
kubenswrapper[4804]: I0217 13:46:34.545398 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9ffb6f5c6-fczv5"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.594097 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53073bd8-b356-4cb8-a190-db417f233b63" path="/var/lib/kubelet/pods/53073bd8-b356-4cb8-a190-db417f233b63/volumes" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.594995 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" path="/var/lib/kubelet/pods/e06838c2-047e-4746-bb20-735a1eb9cb37/volumes" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639257 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639361 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639407 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-scripts\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-secret-key\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639653 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639747 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-combined-ca-bundle\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-config-data\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: 
\"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639869 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640062 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-logs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640102 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vf9\" (UniqueName: \"kubernetes.io/projected/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-kube-api-access-v9vf9\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640169 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-tls-certs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 
13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-logs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.641613 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-scripts\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.642461 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.643848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-config-data\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.644321 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.646317 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659499 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-combined-ca-bundle\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659591 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659557 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659893 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-tls-certs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: 
\"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.660463 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-secret-key\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.662480 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.664455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vf9\" (UniqueName: \"kubernetes.io/projected/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-kube-api-access-v9vf9\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.869628 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.878140 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:36 crc kubenswrapper[4804]: I0217 13:46:36.032584 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:36 crc kubenswrapper[4804]: I0217 13:46:36.106608 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:36 crc kubenswrapper[4804]: I0217 13:46:36.106904 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" containerID="cri-o://d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f" gracePeriod=10 Feb 17 13:46:37 crc kubenswrapper[4804]: I0217 13:46:37.008959 4804 generic.go:334] "Generic (PLEG): container finished" podID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerID="d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f" exitCode=0 Feb 17 13:46:37 crc kubenswrapper[4804]: I0217 13:46:37.009070 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerDied","Data":"d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f"} Feb 17 13:46:39 crc kubenswrapper[4804]: I0217 13:46:39.169937 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Feb 17 13:46:44 crc kubenswrapper[4804]: I0217 13:46:44.169941 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Feb 17 
13:46:45 crc kubenswrapper[4804]: E0217 13:46:45.675447 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 13:46:45 crc kubenswrapper[4804]: E0217 13:46:45.676336 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n76h57h5b7hf6h685h68ch58ch597h99h65fh5c9hcbh58dh66chb5hf6h6dh654h648hch589h66fhf7h5d7hb6h7dh5f8h78h598h559h644h56bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42bnj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400
,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-d659f57fc-rp4h6_openstack(64f8b969-63f0-4c36-baa9-e86e4b0bf0d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:45 crc kubenswrapper[4804]: E0217 13:46:45.680070 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-d659f57fc-rp4h6" podUID="64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.108491 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerDied","Data":"49f64571d9f610637811cf86586750a6d15c78928db1891dc4609e905bc4b08c"} Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.108553 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49f64571d9f610637811cf86586750a6d15c78928db1891dc4609e905bc4b08c" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.157186 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311225 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311339 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311423 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311455 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311505 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311529 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311606 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311686 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.312045 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.312773 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs" (OuterVolumeSpecName: "logs") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.319935 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts" (OuterVolumeSpecName: "scripts") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.320023 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.344886 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.347480 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6" (OuterVolumeSpecName: "kube-api-access-f9lw6") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "kube-api-access-f9lw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.361963 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.364526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data" (OuterVolumeSpecName: "config-data") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417859 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417910 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417923 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417971 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 13:46:46 
crc kubenswrapper[4804]: I0217 13:46:46.417987 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.418002 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.418013 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.418025 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.447373 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 13:46:46 crc kubenswrapper[4804]: E0217 13:46:46.475453 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 17 13:46:46 crc kubenswrapper[4804]: E0217 13:46:46.475667 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v577c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-jz9x9_openstack(19dd0c13-b898-4147-ae5f-cbc5d4915910): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:46 crc kubenswrapper[4804]: E0217 13:46:46.476939 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-jz9x9" 
podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.519490 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.127241 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.129335 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-jz9x9" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.139768 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.143583 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n547h685h5c7h66fh5b5h7fhcfh689h59bh5ffh658h54bh8bh5cdh68h595hb6h5cfh57fhc9hfdh648h5f6h5d6h554h85h589h67dh5c7h5fbh5b7hdcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jjjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e5ccd477-88cd-4284-9de7-f336def1c7a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.148710 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.148877 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n698h57dh657h585h7fh5d5h677h685hcbh86h5fdh75h5b6h689h86hd6hf5h545h55bhffh674hf6h574h5c8h79h68bhffh8dh56h555h699h656q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45pwj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6b56868599-9s4h9_openstack(c40906b1-b78c-4e65-9c35-346626adeba3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 
13:46:47.150815 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6b56868599-9s4h9" podUID="c40906b1-b78c-4e65-9c35-346626adeba3" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.185783 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.185989 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch54ch549h6bh85h58ch68bhd6hc8hc7h5ffh649h5b8h5d5h647h7fh74h54h56dhb7h55chb6hc4h68fh5b6hch5c6h694h86h65dh5d5hcdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f28sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5c54d4859c-6cf2w_openstack(429f2d90-393a-4205-9597-4a1d92dd15be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 
13:46:47.188355 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5c54d4859c-6cf2w" podUID="429f2d90-393a-4205-9597-4a1d92dd15be" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.232679 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.243096 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256010 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.256402 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256416 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.256439 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256445 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256611 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 
13:46:47.256633 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.257547 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.262285 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.262331 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.376239 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442065 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442278 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442397 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442510 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442542 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442570 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442668 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544353 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544412 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544445 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544520 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544552 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"glance-default-internal-api-0\" (UID: 
\"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544601 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544621 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544660 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544926 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.545692 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc 
kubenswrapper[4804]: I0217 13:46:47.545914 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.550814 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.559351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.564544 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.565683 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.574249 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.582949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.591360 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:48 crc kubenswrapper[4804]: I0217 13:46:48.588947 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" path="/var/lib/kubelet/pods/ae662df3-8898-4509-b820-2a918ad3ad7a/volumes" Feb 17 13:46:51 crc kubenswrapper[4804]: I0217 13:46:51.159683 4804 generic.go:334] "Generic (PLEG): container finished" podID="f15102ce-82ca-49c8-a069-25469380b043" containerID="4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2" exitCode=0 Feb 17 13:46:51 crc kubenswrapper[4804]: I0217 13:46:51.159771 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerDied","Data":"4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2"} Feb 17 13:46:54 crc kubenswrapper[4804]: I0217 13:46:54.169427 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Feb 17 13:46:54 crc 
kubenswrapper[4804]: I0217 13:46:54.170512 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.009902 4804 scope.go:117] "RemoveContainer" containerID="08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.109240 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.122184 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.132373 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.146989 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.162323 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.203686 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b56868599-9s4h9" event={"ID":"c40906b1-b78c-4e65-9c35-346626adeba3","Type":"ContainerDied","Data":"0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.203777 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204693 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204725 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204784 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204850 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204876 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.206020 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data" (OuterVolumeSpecName: "config-data") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.206005 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts" (OuterVolumeSpecName: "scripts") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.206134 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs" (OuterVolumeSpecName: "logs") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.210317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c54d4859c-6cf2w" event={"ID":"429f2d90-393a-4205-9597-4a1d92dd15be","Type":"ContainerDied","Data":"fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.210476 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.213438 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.216569 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj" (OuterVolumeSpecName: "kube-api-access-42bnj") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "kube-api-access-42bnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.221859 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d659f57fc-rp4h6" event={"ID":"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9","Type":"ContainerDied","Data":"38398873d4e244754b25fb3ccf4b8d269e3c191649fc54d1a45b9951042bdc7f"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.222074 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.235364 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.235924 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerDied","Data":"87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.235968 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.239667 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerDied","Data":"9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.239782 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306252 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"f15102ce-82ca-49c8-a069-25469380b043\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306315 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306339 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306413 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306492 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306532 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306553 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306581 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306611 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306643 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 
13:46:55.306661 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306689 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306723 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306748 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306791 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"f15102ce-82ca-49c8-a069-25469380b043\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306815 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod 
\"f15102ce-82ca-49c8-a069-25469380b043\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306835 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306879 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.308989 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309019 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309032 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309043 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309056 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309891 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts" (OuterVolumeSpecName: "scripts") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309996 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts" (OuterVolumeSpecName: "scripts") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.310642 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs" (OuterVolumeSpecName: "logs") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.311494 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs" (OuterVolumeSpecName: "logs") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.311841 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data" (OuterVolumeSpecName: "config-data") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.314335 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.314636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm" (OuterVolumeSpecName: "kube-api-access-gg5rm") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "kube-api-access-gg5rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.314876 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx" (OuterVolumeSpecName: "kube-api-access-f28sx") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "kube-api-access-f28sx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.315384 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh" (OuterVolumeSpecName: "kube-api-access-58rdh") pod "f15102ce-82ca-49c8-a069-25469380b043" (UID: "f15102ce-82ca-49c8-a069-25469380b043"). InnerVolumeSpecName "kube-api-access-58rdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.321708 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data" (OuterVolumeSpecName: "config-data") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.325590 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.353587 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj" (OuterVolumeSpecName: "kube-api-access-45pwj") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "kube-api-access-45pwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.355594 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.367903 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f15102ce-82ca-49c8-a069-25469380b043" (UID: "f15102ce-82ca-49c8-a069-25469380b043"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.372833 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.384254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.385027 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config" (OuterVolumeSpecName: "config") pod "f15102ce-82ca-49c8-a069-25469380b043" (UID: "f15102ce-82ca-49c8-a069-25469380b043"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.391344 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.394636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.396883 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config" (OuterVolumeSpecName: "config") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.397957 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410913 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410969 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410984 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410995 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411007 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411019 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411029 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: 
I0217 13:46:55.411040 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411050 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411059 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411069 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411077 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411084 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411092 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411099 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411109 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411116 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411152 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411160 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.613740 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.635645 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.656258 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.662180 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.667736 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 
13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.673046 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.494537 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:46:56 crc kubenswrapper[4804]: E0217 13:46:56.496117 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="init" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.496140 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="init" Feb 17 13:46:56 crc kubenswrapper[4804]: E0217 13:46:56.496270 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15102ce-82ca-49c8-a069-25469380b043" containerName="neutron-db-sync" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.496745 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15102ce-82ca-49c8-a069-25469380b043" containerName="neutron-db-sync" Feb 17 13:46:56 crc kubenswrapper[4804]: E0217 13:46:56.496808 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.496821 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.497188 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15102ce-82ca-49c8-a069-25469380b043" containerName="neutron-db-sync" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.497233 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.498432 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.517186 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.585892 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429f2d90-393a-4205-9597-4a1d92dd15be" path="/var/lib/kubelet/pods/429f2d90-393a-4205-9597-4a1d92dd15be/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.586466 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" path="/var/lib/kubelet/pods/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.587020 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40906b1-b78c-4e65-9c35-346626adeba3" path="/var/lib/kubelet/pods/c40906b1-b78c-4e65-9c35-346626adeba3/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.587536 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" path="/var/lib/kubelet/pods/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.629167 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.630559 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.632926 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.633911 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.634078 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.635195 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mckmx" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.640630 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646710 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646759 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646823 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod 
\"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646908 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748224 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748559 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748591 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748622 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748662 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748767 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748793 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748827 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748842 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.749500 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: 
\"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.750116 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.750833 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.751181 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.751611 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.772099 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 
13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850611 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850684 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850714 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850805 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.854441 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.854466 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.854807 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.855086 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.860570 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.869620 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.955161 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.009572 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.009727 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trmx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f9zkj_openstack(02a921c8-6579-451b-beaf-9832cf900668): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.010947 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-f9zkj" podUID="02a921c8-6579-451b-beaf-9832cf900668" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.274800 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-f9zkj" podUID="02a921c8-6579-451b-beaf-9832cf900668" Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.389359 4804 scope.go:117] "RemoveContainer" containerID="d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f" Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.540932 4804 scope.go:117] "RemoveContainer" containerID="db5a3b86c0d8b3db5d6271f9217c22ad17bdcc258a7e248a0fd7a959c200bb06" Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.620641 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.813392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"] Feb 17 13:46:57 crc kubenswrapper[4804]: W0217 13:46:57.826615 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96609ec5_c9e0_4611_85ff_f7dc474d889a.slice/crio-7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0 WatchSource:0}: Error finding container 7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0: Status 404 returned error can't find the container with id 7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0 Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.942406 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.987012 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9ffb6f5c6-fczv5"] Feb 17 13:46:57 crc kubenswrapper[4804]: W0217 13:46:57.993791 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85415d6a_8a5f_4b65_b182_2bfe221e8eee.slice/crio-70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3 WatchSource:0}: Error finding container 70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3: Status 404 returned error can't find the container with id 70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3 Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.149631 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:58 crc kubenswrapper[4804]: W0217 13:46:58.161819 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ec519a7_9081_4341_ad6c_c81dda70bd3a.slice/crio-ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36 WatchSource:0}: Error finding container ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36: Status 404 returned error can't find the container with id ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36 Feb 17 13:46:58 crc 
kubenswrapper[4804]: I0217 13:46:58.193712 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.275996 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:46:58 crc kubenswrapper[4804]: W0217 13:46:58.283702 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2f2352e_7e9b_439f_be3c_b48b70681658.slice/crio-f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d WatchSource:0}: Error finding container f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d: Status 404 returned error can't find the container with id f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.285834 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerStarted","Data":"604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.285868 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerStarted","Data":"7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.293192 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.311110 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerStarted","Data":"ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.315245 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerStarted","Data":"872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.318509 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7kgzk" podStartSLOduration=24.318489637 podStartE2EDuration="24.318489637s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:58.307628395 +0000 UTC m=+1292.419047732" watchObservedRunningTime="2026-02-17 13:46:58.318489637 +0000 UTC m=+1292.429908984" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.321579 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerStarted","Data":"70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.327569 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9ffb6f5c6-fczv5" event={"ID":"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f","Type":"ContainerStarted","Data":"efcc6251e72926fc927c924715b8a426c728584d969b1b7a63b8d304f8f0c323"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.346145 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerStarted","Data":"b085a946a0d0c5dd1859aecc784b43e603e7ed1f79fe7a947c4f1b01db4b14a2"} Feb 17 13:46:58 crc 
kubenswrapper[4804]: I0217 13:46:58.350296 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xf9m6" podStartSLOduration=5.560752154 podStartE2EDuration="33.350278835s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="2026-02-17 13:46:27.20337636 +0000 UTC m=+1261.314795697" lastFinishedPulling="2026-02-17 13:46:54.992903041 +0000 UTC m=+1289.104322378" observedRunningTime="2026-02-17 13:46:58.333962483 +0000 UTC m=+1292.445381830" watchObservedRunningTime="2026-02-17 13:46:58.350278835 +0000 UTC m=+1292.461698172" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.354093 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerStarted","Data":"acc1d16ca31ae16b95fd7513bacd065031f5a80799a0b49cb8f97e1864a0396a"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.531610 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.533548 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.540532 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.545625 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.545676 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698503 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698543 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698617 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698680 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698711 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698765 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698796 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.800995 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801331 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801371 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801426 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801455 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801480 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801503 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod \"neutron-77797bd57-r2gff\" 
(UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.809870 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.809902 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.810567 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.811338 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.811887 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.815893 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.819871 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.988104 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.171318 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.375085 4804 generic.go:334] "Generic (PLEG): container finished" podID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" exitCode=0 Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.375224 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerDied","Data":"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.382452 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" 
event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerStarted","Data":"ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.405946 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerStarted","Data":"3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.414411 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jz9x9" podStartSLOduration=3.517771708 podStartE2EDuration="34.414377204s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="2026-02-17 13:46:27.142357433 +0000 UTC m=+1261.253776770" lastFinishedPulling="2026-02-17 13:46:58.038962929 +0000 UTC m=+1292.150382266" observedRunningTime="2026-02-17 13:46:59.409667996 +0000 UTC m=+1293.521087333" watchObservedRunningTime="2026-02-17 13:46:59.414377204 +0000 UTC m=+1293.525796541" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.417606 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9ffb6f5c6-fczv5" event={"ID":"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f","Type":"ContainerStarted","Data":"3c7dfb1330433a8d2839e256e243dd44b22edc10229d43afe0d5574fd17b96aa"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.423674 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerStarted","Data":"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.423715 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" 
event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerStarted","Data":"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.423749 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerStarted","Data":"f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.424849 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.428511 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerStarted","Data":"96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.455443 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-547f989fd6-rqkvc" podStartSLOduration=3.455425044 podStartE2EDuration="3.455425044s" podCreationTimestamp="2026-02-17 13:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:59.443578422 +0000 UTC m=+1293.554997759" watchObservedRunningTime="2026-02-17 13:46:59.455425044 +0000 UTC m=+1293.566844381" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.685852 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:46:59 crc kubenswrapper[4804]: W0217 13:46:59.699270 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dd4a1b7_336a_4b57_a341_a413ccd8a223.slice/crio-b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464 
WatchSource:0}: Error finding container b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464: Status 404 returned error can't find the container with id b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464 Feb 17 13:47:00 crc kubenswrapper[4804]: I0217 13:47:00.438126 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerStarted","Data":"b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.449377 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerStarted","Data":"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.449989 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.452022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerStarted","Data":"74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.453650 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerStarted","Data":"d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.455229 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerStarted","Data":"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef"} Feb 17 13:47:01 crc 
kubenswrapper[4804]: I0217 13:47:01.456899 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9ffb6f5c6-fczv5" event={"ID":"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f","Type":"ContainerStarted","Data":"c98e2e3f274ed3b6712ded3d6d3a0315988fce20ffc58176db8f6fc0c39cdb28"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.459698 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerStarted","Data":"d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.459892 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log" containerID="cri-o://96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1" gracePeriod=30 Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.459928 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd" containerID="cri-o://d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f" gracePeriod=30 Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.473936 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" podStartSLOduration=5.473920572 podStartE2EDuration="5.473920572s" podCreationTimestamp="2026-02-17 13:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:01.468991128 +0000 UTC m=+1295.580410465" watchObservedRunningTime="2026-02-17 13:47:01.473920572 +0000 UTC m=+1295.585339909" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.505422 4804 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.505405872 podStartE2EDuration="14.505405872s" podCreationTimestamp="2026-02-17 13:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:01.492759155 +0000 UTC m=+1295.604178492" watchObservedRunningTime="2026-02-17 13:47:01.505405872 +0000 UTC m=+1295.616825209" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.532738 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9ffb6f5c6-fczv5" podStartSLOduration=26.975343411 podStartE2EDuration="27.532716601s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="2026-02-17 13:46:57.996167815 +0000 UTC m=+1292.107587152" lastFinishedPulling="2026-02-17 13:46:58.553541015 +0000 UTC m=+1292.664960342" observedRunningTime="2026-02-17 13:47:01.513477936 +0000 UTC m=+1295.624897273" watchObservedRunningTime="2026-02-17 13:47:01.532716601 +0000 UTC m=+1295.644135938" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.535942 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=27.535924412 podStartE2EDuration="27.535924412s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:01.532088512 +0000 UTC m=+1295.643507859" watchObservedRunningTime="2026-02-17 13:47:01.535924412 +0000 UTC m=+1295.647343749" Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.470559 4804 generic.go:334] "Generic (PLEG): container finished" podID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerID="872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e" exitCode=0 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.470638 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerDied","Data":"872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e"} Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473125 4804 generic.go:334] "Generic (PLEG): container finished" podID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerID="d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f" exitCode=0 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473158 4804 generic.go:334] "Generic (PLEG): container finished" podID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerID="96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1" exitCode=143 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerDied","Data":"d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f"} Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerDied","Data":"96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1"} Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.475076 4804 generic.go:334] "Generic (PLEG): container finished" podID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerID="604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64" exitCode=0 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.475224 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerDied","Data":"604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64"} Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 
13:47:04.396622 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.396897 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.520999 4804 generic.go:334] "Generic (PLEG): container finished" podID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerID="ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc" exitCode=0 Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.521046 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerDied","Data":"ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc"} Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.741087 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.753639 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xf9m6" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.824888 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.824946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.824982 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825019 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825052 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825117 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825183 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825282 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825324 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825359 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.828603 4804 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs" (OuterVolumeSpecName: "logs") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.833293 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts" (OuterVolumeSpecName: "scripts") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.843370 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.843444 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.843447 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p" (OuterVolumeSpecName: "kube-api-access-rhl8p") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). 
InnerVolumeSpecName "kube-api-access-rhl8p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.847843 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts" (OuterVolumeSpecName: "scripts") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.847953 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s" (OuterVolumeSpecName: "kube-api-access-snn6s") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "kube-api-access-snn6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.867313 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data" (OuterVolumeSpecName: "config-data") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.877610 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data" (OuterVolumeSpecName: "config-data") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.878810 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9ffb6f5c6-fczv5"
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.878892 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9ffb6f5c6-fczv5"
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.899830 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.906547 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936471 4804 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936510 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936525 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936539 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936548 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936560 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936570 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936580 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936591 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936603 4804 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936615 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.119257 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.245937 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246009 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246034 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246056 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246075 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246111 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246135 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246246 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246987 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs" (OuterVolumeSpecName: "logs") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.250598 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.250812 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts" (OuterVolumeSpecName: "scripts") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.266492 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb" (OuterVolumeSpecName: "kube-api-access-rrhpb") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "kube-api-access-rrhpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.272037 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.295514 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.313752 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.319376 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data" (OuterVolumeSpecName: "config-data") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348244 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348277 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348302 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348316 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348325 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348358 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348382 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348391 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.366588 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.450325 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.533666 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerDied","Data":"ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75"}
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.533710 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.534113 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xf9m6"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.535864 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerStarted","Data":"c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983"}
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.544910 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerStarted","Data":"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c"}
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.545064 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77797bd57-r2gff"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.549031 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerDied","Data":"b085a946a0d0c5dd1859aecc784b43e603e7ed1f79fe7a947c4f1b01db4b14a2"}
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.549081 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.549121 4804 scope.go:117] "RemoveContainer" containerID="d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.550749 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.550761 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerDied","Data":"7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0"}
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.550848 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.563267 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58989b55cb-zjfvf" podStartSLOduration=30.614830103 podStartE2EDuration="31.563244305s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="2026-02-17 13:46:57.995752222 +0000 UTC m=+1292.107171559" lastFinishedPulling="2026-02-17 13:46:58.944166424 +0000 UTC m=+1293.055585761" observedRunningTime="2026-02-17 13:47:05.56085359 +0000 UTC m=+1299.672272927" watchObservedRunningTime="2026-02-17 13:47:05.563244305 +0000 UTC m=+1299.674663662"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.563659 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5"}
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.583872 4804 scope.go:117] "RemoveContainer" containerID="96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.604971 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77797bd57-r2gff" podStartSLOduration=7.604950856 podStartE2EDuration="7.604950856s" podCreationTimestamp="2026-02-17 13:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:05.591148962 +0000 UTC m=+1299.702568299" watchObservedRunningTime="2026-02-17 13:47:05.604950856 +0000 UTC m=+1299.716370193"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.634380 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.694452 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.732819 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 13:47:05.733227 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerName="keystone-bootstrap"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733241 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerName="keystone-bootstrap"
Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 13:47:05.733255 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733260 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd"
Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 13:47:05.733278 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733284 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log"
Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 13:47:05.733300 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerName="placement-db-sync"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733306 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerName="placement-db-sync"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733476 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733489 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733499 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerName="keystone-bootstrap"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733510 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerName="placement-db-sync"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.734389 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.737161 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.737252 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.742625 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.868068 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9cc757857-wng6k"]
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.869753 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.880040 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881050 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881249 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881305 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881462 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881522 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884553 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884579 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884596 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884671 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884690 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884710 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.919305 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9cc757857-wng6k"]
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.985089 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"]
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.986964 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987051 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-credential-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987102 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-public-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987229 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-scripts\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987254 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-internal-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987457 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf547\" (UniqueName: \"kubernetes.io/projected/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-kube-api-access-xf547\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987592 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987623 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987959 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0"
Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.988753 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-fernet-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.992976 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.993917 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-combined-ca-bundle\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994093 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-config-data\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994359 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994418 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.997934 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.998319 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.005424 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.012949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.022552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.040366 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"]
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.040648 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.043794 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.044008 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.044183 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.045524 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.047936 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dr6jm"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.061836 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.073547 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.089131 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100355 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-public-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100421 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-scripts\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-internal-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf547\" (UniqueName:
\"kubernetes.io/projected/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-kube-api-access-xf547\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100562 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-fernet-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-combined-ca-bundle\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100659 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-config-data\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100759 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-credential-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.111853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-credential-keys\") pod 
\"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.112647 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-scripts\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.113243 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-public-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.121728 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-internal-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.125510 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-config-data\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.127745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-combined-ca-bundle\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc 
kubenswrapper[4804]: I0217 13:47:06.129758 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-fernet-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.136986 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf547\" (UniqueName: \"kubernetes.io/projected/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-kube-api-access-xf547\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.200652 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.207629 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.207727 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.207921 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208000 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208051 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208232 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208335 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.251979 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.255991 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d69649784-lnwhw"] Feb 17 13:47:06 crc kubenswrapper[4804]: E0217 13:47:06.256481 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerName="barbican-db-sync" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.256499 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerName="barbican-db-sync" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.256773 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerName="barbican-db-sync" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.257941 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.275453 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d69649784-lnwhw"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.310516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.312519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.312823 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.312978 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.313302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.314431 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.315434 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.316012 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.318887 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.319041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.320788 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.320998 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.329605 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: 
\"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.334883 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.420993 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"19dd0c13-b898-4147-ae5f-cbc5d4915910\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421083 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"19dd0c13-b898-4147-ae5f-cbc5d4915910\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"19dd0c13-b898-4147-ae5f-cbc5d4915910\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421344 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl6h\" (UniqueName: \"kubernetes.io/projected/858d67cb-268b-4724-bba9-a7ab9a10ed6c-kube-api-access-kkl6h\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 
crc kubenswrapper[4804]: I0217 13:47:06.421434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-internal-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421520 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-public-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421560 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-config-data\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421590 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-scripts\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421633 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d67cb-268b-4724-bba9-a7ab9a10ed6c-logs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc 
kubenswrapper[4804]: I0217 13:47:06.421668 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-combined-ca-bundle\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.434655 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c" (OuterVolumeSpecName: "kube-api-access-v577c") pod "19dd0c13-b898-4147-ae5f-cbc5d4915910" (UID: "19dd0c13-b898-4147-ae5f-cbc5d4915910"). InnerVolumeSpecName "kube-api-access-v577c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.434786 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "19dd0c13-b898-4147-ae5f-cbc5d4915910" (UID: "19dd0c13-b898-4147-ae5f-cbc5d4915910"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.471749 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19dd0c13-b898-4147-ae5f-cbc5d4915910" (UID: "19dd0c13-b898-4147-ae5f-cbc5d4915910"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.523862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-public-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524151 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-config-data\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524178 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-scripts\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524214 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d67cb-268b-4724-bba9-a7ab9a10ed6c-logs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524244 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-combined-ca-bundle\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 
13:47:06.524274 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkl6h\" (UniqueName: \"kubernetes.io/projected/858d67cb-268b-4724-bba9-a7ab9a10ed6c-kube-api-access-kkl6h\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524316 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-internal-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524385 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524397 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524407 4804 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524920 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d67cb-268b-4724-bba9-a7ab9a10ed6c-logs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.528495 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-scripts\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.529057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-public-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.529084 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-internal-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.529727 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-config-data\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.532180 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-combined-ca-bundle\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.541485 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl6h\" (UniqueName: 
\"kubernetes.io/projected/858d67cb-268b-4724-bba9-a7ab9a10ed6c-kube-api-access-kkl6h\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.549157 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.600166 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" path="/var/lib/kubelet/pods/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a/volumes"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.614963 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jz9x9"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.615026 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerDied","Data":"3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874"}
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.615051 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.617563 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d69649784-lnwhw"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.720979 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.762508 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f97f9545f-tngcj"]
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.767422 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.768986 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f97f9545f-tngcj"]
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.781356 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6zhqd"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.781700 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.782857 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.822450 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9cc757857-wng6k"]
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-combined-ca-bundle\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831333 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h62hk\" (UniqueName: \"kubernetes.io/projected/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-kube-api-access-h62hk\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-logs\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831442 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data-custom\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.863598 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-k5l98"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.885849 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f46489f4-x24zj"]
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.918930 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.922458 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.932982 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f46489f4-x24zj"]
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-combined-ca-bundle\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933577 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h62hk\" (UniqueName: \"kubernetes.io/projected/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-kube-api-access-h62hk\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933616 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-logs\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933636 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data-custom\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.934351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-logs\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.942836 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-combined-ca-bundle\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.943603 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.945617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data-custom\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.963863 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h62hk\" (UniqueName: \"kubernetes.io/projected/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-kube-api-access-h62hk\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.033900 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"]
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035418 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/297a0648-3cbd-4f1e-8bc4-d918a702c33b-logs\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035458 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-combined-ca-bundle\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035487 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035586 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w29cv\" (UniqueName: \"kubernetes.io/projected/297a0648-3cbd-4f1e-8bc4-d918a702c33b-kube-api-access-w29cv\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035609 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data-custom\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.065809 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"]
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.067286 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.085552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"]
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.094300 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"]
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.095905 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.099286 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.122452 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f97f9545f-tngcj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.136049 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"]
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139241 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139410 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139465 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w29cv\" (UniqueName: \"kubernetes.io/projected/297a0648-3cbd-4f1e-8bc4-d918a702c33b-kube-api-access-w29cv\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139543 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139583 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data-custom\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139636 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139713 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/297a0648-3cbd-4f1e-8bc4-d918a702c33b-logs\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-combined-ca-bundle\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139773 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139798 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.141328 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/297a0648-3cbd-4f1e-8bc4-d918a702c33b-logs\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.148808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.173600 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-combined-ca-bundle\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.176922 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w29cv\" (UniqueName: \"kubernetes.io/projected/297a0648-3cbd-4f1e-8bc4-d918a702c33b-kube-api-access-w29cv\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.178529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data-custom\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241681 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241808 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241837 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241889 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241912 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241930 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241962 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241984 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.242018 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.242044 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.242870 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.243924 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.243995 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.246184 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.247057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.270651 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.343945 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344262 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344335 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344510 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.345311 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.350851 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f46489f4-x24zj"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.350913 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.351155 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.359933 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.360561 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.382780 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.496711 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.592473 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.592518 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.685840 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerStarted","Data":"eefbbd3f0b520bf32a3f3135f04b4227a82b1cef683b398cd1cff8682da24dc5"}
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.692139 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" containerID="cri-o://49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" gracePeriod=10
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.693230 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9cc757857-wng6k" event={"ID":"30df70d3-9323-4ddd-9d1c-2dae72cff6d9","Type":"ContainerStarted","Data":"4066f7167e65b72192b5a8b8761ef6253bac87e59c26264981e57eed910d7b00"}
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.693272 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9cc757857-wng6k" event={"ID":"30df70d3-9323-4ddd-9d1c-2dae72cff6d9","Type":"ContainerStarted","Data":"818e2c29c503b423be035ac1c8c503b52ac8e692015ea1a5cf2a62ba7d1eb249"}
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.693290 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.700308 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.700941 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.730903 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9cc757857-wng6k" podStartSLOduration=2.730881261 podStartE2EDuration="2.730881261s" podCreationTimestamp="2026-02-17 13:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:07.716795718 +0000 UTC m=+1301.828215055" watchObservedRunningTime="2026-02-17 13:47:07.730881261 +0000 UTC m=+1301.842300598"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.761964 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f97f9545f-tngcj"]
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.790585 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.954228 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f46489f4-x24zj"]
Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.972143 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d69649784-lnwhw"]
Feb 17 13:47:08 crc kubenswrapper[4804]: W0217 13:47:08.013643 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod858d67cb_268b_4724_bba9_a7ab9a10ed6c.slice/crio-aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7 WatchSource:0}: Error finding container aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7: Status 404 returned error can't find the container with id aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.025727 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"]
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.288366 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"]
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.510729 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98"
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622848 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") "
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622892 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") "
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622949 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") "
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622980 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") "
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.623066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") "
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.623120 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") "
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.684047 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf" (OuterVolumeSpecName: "kube-api-access-gr4tf") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "kube-api-access-gr4tf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.743382 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.817850 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config" (OuterVolumeSpecName: "config") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.818109 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.818356 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.822047 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.824732 4804 generic.go:334] "Generic (PLEG): container finished" podID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" exitCode=0
Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.824774 4804 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845698 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845953 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845963 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845971 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.861655 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.950094 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036888 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036952 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036966 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerStarted","Data":"953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036985 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f97f9545f-tngcj" event={"ID":"c7f4e4c3-9ec8-4923-bf7b-4058899e863f","Type":"ContainerStarted","Data":"6d4b3efff11097530d0823774a5b359f486e2f4a1ee72bdad83ea74ea46124cd"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037017 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d69649784-lnwhw" event={"ID":"858d67cb-268b-4724-bba9-a7ab9a10ed6c","Type":"ContainerStarted","Data":"d142c1aa2575d7b9e84287db646c2533a45b3dabc061a435a7b2335843cef639"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037030 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d69649784-lnwhw" event={"ID":"858d67cb-268b-4724-bba9-a7ab9a10ed6c","Type":"ContainerStarted","Data":"aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037038 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" event={"ID":"297a0648-3cbd-4f1e-8bc4-d918a702c33b","Type":"ContainerStarted","Data":"cb47a75c1cfc217b32c8ce0b08563ac965de4aa9dba350829c6314a0811ff20a"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037047 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerStarted","Data":"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037058 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerStarted","Data":"ba0cc2230bff6e65b06b28d38b4ed605a390c7cff5692c7c90bbd33cf0934ef3"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037066 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerDied","Data":"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037101 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerDied","Data":"acc1d16ca31ae16b95fd7513bacd065031f5a80799a0b49cb8f97e1864a0396a"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037111 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerStarted","Data":"1ceb04ce2633cdf168f7ec2c7223a7b5436da513a4112c0b4cec53ae79c55d6e"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037130 4804 scope.go:117] "RemoveContainer" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" Feb 17 
13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.132755 4804 scope.go:117] "RemoveContainer" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.192815 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.213648 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.219715 4804 scope.go:117] "RemoveContainer" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.221490 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968\": container with ID starting with 49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968 not found: ID does not exist" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.221546 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968"} err="failed to get container status \"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968\": rpc error: code = NotFound desc = could not find container \"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968\": container with ID starting with 49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968 not found: ID does not exist" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.221569 4804 scope.go:117] "RemoveContainer" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.222325 4804 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47\": container with ID starting with d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47 not found: ID does not exist" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.222377 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47"} err="failed to get container status \"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47\": rpc error: code = NotFound desc = could not find container \"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47\": container with ID starting with d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47 not found: ID does not exist" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.846791 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerStarted","Data":"7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.847145 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerStarted","Data":"3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.847157 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerStarted","Data":"0345398b72b1e224ae72d70254cce0d2edce23c11bae1237519d2ed6af3adbe1"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.851460 4804 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.851495 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.858032 4804 generic.go:334] "Generic (PLEG): container finished" podID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerID="61ec7f44479bb9daa4c9c91948a35a994fd304cc644764be5a4bbb119a672347" exitCode=0 Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.858359 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerStarted","Data":"d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.858380 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerDied","Data":"61ec7f44479bb9daa4c9c91948a35a994fd304cc644764be5a4bbb119a672347"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.859029 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.869135 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerStarted","Data":"b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.875023 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6955855558-kv2ld" podStartSLOduration=3.874999379 podStartE2EDuration="3.874999379s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.867991619 +0000 UTC m=+1303.979410946" watchObservedRunningTime="2026-02-17 13:47:09.874999379 +0000 UTC m=+1303.986418716" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.883665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d69649784-lnwhw" event={"ID":"858d67cb-268b-4724-bba9-a7ab9a10ed6c","Type":"ContainerStarted","Data":"1718bb4d39cef70818fc83641992541a17ca3fafedb93bd6235c83058144fd75"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.884567 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.914153 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerStarted","Data":"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.914243 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.914319 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.916565 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.919940 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" podStartSLOduration=3.919922781 podStartE2EDuration="3.919922781s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.890736014 +0000 UTC 
m=+1304.002155361" watchObservedRunningTime="2026-02-17 13:47:09.919922781 +0000 UTC m=+1304.031342118" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.930692 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d69649784-lnwhw" podStartSLOduration=3.930672989 podStartE2EDuration="3.930672989s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.930028729 +0000 UTC m=+1304.041448066" watchObservedRunningTime="2026-02-17 13:47:09.930672989 +0000 UTC m=+1304.042092326" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.957772 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.9577567 podStartE2EDuration="4.9577567s" podCreationTimestamp="2026-02-17 13:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.949506411 +0000 UTC m=+1304.060925748" watchObservedRunningTime="2026-02-17 13:47:09.9577567 +0000 UTC m=+1304.069176027" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.991244 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cc7c97fdd-bhd7w"] Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.991669 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.991680 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.991695 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="init" Feb 17 13:47:09 
crc kubenswrapper[4804]: I0217 13:47:09.991701 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="init" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.991899 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.992842 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.007817 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.008070 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.008671 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67b49bc6f6-6kg64" podStartSLOduration=5.00865277 podStartE2EDuration="5.00865277s" podCreationTimestamp="2026-02-17 13:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.980667301 +0000 UTC m=+1304.092086638" watchObservedRunningTime="2026-02-17 13:47:10.00865277 +0000 UTC m=+1304.120072107" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.049945 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc7c97fdd-bhd7w"] Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.073994 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b89da32-9537-4c7b-a266-0d38ac52b069-logs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " 
pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.074048 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data-custom\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076449 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076594 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-public-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076649 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h6vr\" (UniqueName: \"kubernetes.io/projected/2b89da32-9537-4c7b-a266-0d38ac52b069-kube-api-access-8h6vr\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076904 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-internal-tls-certs\") pod 
\"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.077009 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-combined-ca-bundle\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182120 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b89da32-9537-4c7b-a266-0d38ac52b069-logs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data-custom\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182513 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-public-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: 
\"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182675 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h6vr\" (UniqueName: \"kubernetes.io/projected/2b89da32-9537-4c7b-a266-0d38ac52b069-kube-api-access-8h6vr\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182762 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-internal-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182825 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-combined-ca-bundle\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.184893 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b89da32-9537-4c7b-a266-0d38ac52b069-logs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.199762 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-combined-ca-bundle\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " 
pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.199804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-public-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.203455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-internal-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.203586 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data-custom\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.205329 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.224953 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h6vr\" (UniqueName: \"kubernetes.io/projected/2b89da32-9537-4c7b-a266-0d38ac52b069-kube-api-access-8h6vr\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc 
kubenswrapper[4804]: I0217 13:47:10.334873 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.605422 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" path="/var/lib/kubelet/pods/df6e7376-a420-4a04-abf8-ab5bc3f76d7c/volumes" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.757864 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.759741 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.863710 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc7c97fdd-bhd7w"] Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.940751 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:11 crc kubenswrapper[4804]: I0217 13:47:11.951139 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" event={"ID":"2b89da32-9537-4c7b-a266-0d38ac52b069","Type":"ContainerStarted","Data":"c8ca657f6236874f1b16afcede029cb285c5198e2af52a99411a433e516ce462"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.963610 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" event={"ID":"297a0648-3cbd-4f1e-8bc4-d918a702c33b","Type":"ContainerStarted","Data":"3328fd468939de33640af98a7f6863a2bddfa10ef51851ae30229e4ba4d549bf"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.963976 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" 
event={"ID":"297a0648-3cbd-4f1e-8bc4-d918a702c33b","Type":"ContainerStarted","Data":"fd1c4f6788f3648bec6c2023e257e90bce62947fdd08eff0aa5554c1655bb985"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.966298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f97f9545f-tngcj" event={"ID":"c7f4e4c3-9ec8-4923-bf7b-4058899e863f","Type":"ContainerStarted","Data":"69d1f41d1a7f6153e6465f83519516ab918b4bcb906ffb3bc452ca225cddd7da"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.968055 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" event={"ID":"2b89da32-9537-4c7b-a266-0d38ac52b069","Type":"ContainerStarted","Data":"bbb93f0a32531c05e1658bd58d304296c70a556976494d7b1c76d834a1ac7d52"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.970352 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerStarted","Data":"2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.987669 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" podStartSLOduration=3.300717027 podStartE2EDuration="6.98765027s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="2026-02-17 13:47:07.959842659 +0000 UTC m=+1302.071261996" lastFinishedPulling="2026-02-17 13:47:11.646775902 +0000 UTC m=+1305.758195239" observedRunningTime="2026-02-17 13:47:12.97969222 +0000 UTC m=+1307.091111557" watchObservedRunningTime="2026-02-17 13:47:12.98765027 +0000 UTC m=+1307.099069607" Feb 17 13:47:13 crc kubenswrapper[4804]: I0217 13:47:13.000114 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f9zkj" podStartSLOduration=2.790507695 podStartE2EDuration="48.000090331s" 
podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="2026-02-17 13:46:26.457246526 +0000 UTC m=+1260.568665863" lastFinishedPulling="2026-02-17 13:47:11.666829162 +0000 UTC m=+1305.778248499" observedRunningTime="2026-02-17 13:47:12.998054067 +0000 UTC m=+1307.109473404" watchObservedRunningTime="2026-02-17 13:47:13.000090331 +0000 UTC m=+1307.111509678" Feb 17 13:47:14 crc kubenswrapper[4804]: I0217 13:47:14.870507 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:47:14 crc kubenswrapper[4804]: I0217 13:47:14.872031 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:47:14 crc kubenswrapper[4804]: I0217 13:47:14.882677 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9ffb6f5c6-fczv5" podUID="e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.090024 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.090821 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.116543 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.127878 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.020872 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-5f97f9545f-tngcj" event={"ID":"c7f4e4c3-9ec8-4923-bf7b-4058899e863f","Type":"ContainerStarted","Data":"53928c720a0f168a47cb2e57fd13c48b24857ae2a6615ea52cf04473b07d7cd1"} Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.024959 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" event={"ID":"2b89da32-9537-4c7b-a266-0d38ac52b069","Type":"ContainerStarted","Data":"1e0154e2eb8f5cf5b5fb7a8b17bfce3481ede2d328565aaa84ed6a947e22e95b"} Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.024995 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.025008 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.025016 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.025026 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.042776 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f97f9545f-tngcj" podStartSLOduration=7.236516192 podStartE2EDuration="11.042759515s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="2026-02-17 13:47:07.842246242 +0000 UTC m=+1301.953665579" lastFinishedPulling="2026-02-17 13:47:11.648489555 +0000 UTC m=+1305.759908902" observedRunningTime="2026-02-17 13:47:17.034178496 +0000 UTC m=+1311.145597833" watchObservedRunningTime="2026-02-17 13:47:17.042759515 +0000 UTC m=+1311.154178852" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.082243 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-7cc7c97fdd-bhd7w" podStartSLOduration=8.082221876 podStartE2EDuration="8.082221876s" podCreationTimestamp="2026-02-17 13:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:17.060102931 +0000 UTC m=+1311.171522278" watchObservedRunningTime="2026-02-17 13:47:17.082221876 +0000 UTC m=+1311.193641203" Feb 17 13:47:17 crc kubenswrapper[4804]: E0217 13:47:17.193787 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.384374 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.502740 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.502990 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" containerID="cri-o://b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41" gracePeriod=10 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.054173 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5"} Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.054880 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" containerID="cri-o://c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" gracePeriod=30 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.055049 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.055173 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" containerID="cri-o://2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" gracePeriod=30 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.055245 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" containerID="cri-o://c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" gracePeriod=30 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.089462 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fa3f342-a062-421d-8c06-f53468a8db00" containerID="b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41" exitCode=0 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.089649 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerDied","Data":"b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41"} Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.163238 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249683 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249802 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249883 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249908 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249956 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.250045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.256348 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg" (OuterVolumeSpecName: "kube-api-access-9bfbg") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "kube-api-access-9bfbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.313952 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.329686 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.329716 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.338136 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353322 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353431 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353448 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353460 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353474 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.360661 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config" (OuterVolumeSpecName: "config") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.454502 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.790717 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.098599 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerDied","Data":"cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.098635 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.098653 4804 scope.go:117] "RemoveContainer" containerID="b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.104973 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" exitCode=0 Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.105008 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" exitCode=2 Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.105050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.105095 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.109824 4804 generic.go:334] "Generic (PLEG): container finished" podID="02a921c8-6579-451b-beaf-9832cf900668" containerID="2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2" exitCode=0 Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.109902 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.109913 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.110110 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerDied","Data":"2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.138423 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.148014 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.272497 4804 scope.go:117] "RemoveContainer" containerID="63be9f06e01e3909b7ff94ea9b177c0a528139e2942719322a381a426d4f2574" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.527879 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.614757 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.745503 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.752818 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.501276 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.584329 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" path="/var/lib/kubelet/pods/1fa3f342-a062-421d-8c06-f53468a8db00/volumes" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.698768 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699145 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699178 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699241 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699307 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699392 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699492 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.700192 4804 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.708384 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.712552 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2" (OuterVolumeSpecName: "kube-api-access-trmx2") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "kube-api-access-trmx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.716300 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts" (OuterVolumeSpecName: "scripts") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.762021 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.762445 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data" (OuterVolumeSpecName: "config-data") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801786 4804 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801819 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801830 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801841 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801853 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.130820 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerDied","Data":"ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c"} Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.130847 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.130864 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.443496 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: E0217 13:47:21.447013 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="init" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447041 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="init" Feb 17 13:47:21 crc kubenswrapper[4804]: E0217 13:47:21.447081 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a921c8-6579-451b-beaf-9832cf900668" containerName="cinder-db-sync" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447090 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a921c8-6579-451b-beaf-9832cf900668" containerName="cinder-db-sync" Feb 17 13:47:21 crc kubenswrapper[4804]: E0217 13:47:21.447102 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447108 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447327 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447348 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a921c8-6579-451b-beaf-9832cf900668" containerName="cinder-db-sync" Feb 17 
13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.448260 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453049 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453117 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453398 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r5hqb" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453511 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.457377 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.518249 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.519946 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.548133 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623629 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623708 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623736 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4p6n\" (UniqueName: \"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623773 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623826 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623866 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623892 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623939 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623959 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.624019 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.624087 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.725956 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726064 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4p6n\" (UniqueName: 
\"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726101 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726150 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726191 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726231 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726257 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726280 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726309 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726366 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726422 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726852 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc 
kubenswrapper[4804]: I0217 13:47:21.727410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.727472 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.727857 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.729010 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.729965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.733997 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.735615 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.735984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.739719 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.751482 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4p6n\" (UniqueName: \"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.758814 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: 
\"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.758872 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.760347 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.762963 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.774807 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.777525 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.864738 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930160 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930219 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930266 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930317 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930335 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032065 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032420 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032505 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032654 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfrmd\" (UniqueName: 
\"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032702 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032740 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032761 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032879 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.039758 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.043283 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.043830 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.051104 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.055445 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.103480 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.215965 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.233830 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.625171 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.799115 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.176721 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.189138 4804 generic.go:334] "Generic (PLEG): container finished" podID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165" exitCode=0 Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.189231 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerDied","Data":"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.189264 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerStarted","Data":"2d996d992d2a3254b879bd96b12e636e65525644b7181f7f3f61897c257c69b0"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.194979 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerStarted","Data":"3fb4a87b299d4238d785f3f50f31c965eb4578228964e6ab3bbb8b0fd289c1da"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.207074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerStarted","Data":"2618a56a4b1417c4a63c4fff93e5f2af5e701449d4dbe686563ceaa84785f504"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215546 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" exitCode=0 Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215609 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215641 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215664 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215704 4804 scope.go:117] "RemoveContainer" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.247580 4804 scope.go:117] "RemoveContainer" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.264869 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.264924 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.264949 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265007 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265112 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265132 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.268538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.268832 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.279489 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd" (OuterVolumeSpecName: "kube-api-access-8jjjd") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "kube-api-access-8jjjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.282334 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts" (OuterVolumeSpecName: "scripts") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.307388 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.316662 4804 scope.go:117] "RemoveContainer" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.361349 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369580 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369613 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369623 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369633 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369642 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369651 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.375834 4804 scope.go:117] "RemoveContainer" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.376542 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5\": container with ID starting with 2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5 not found: ID does not exist" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.376598 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5"} err="failed to get container status \"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5\": rpc error: code = NotFound desc = could not find container \"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5\": container with ID starting with 2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5 not found: ID does not exist" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.376629 4804 scope.go:117] "RemoveContainer" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.377137 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5\": container with ID starting with c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5 not found: ID does not exist" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.377216 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5"} err="failed to get container status \"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5\": rpc error: code = NotFound desc = could not find container \"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5\": 
container with ID starting with c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5 not found: ID does not exist" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.377236 4804 scope.go:117] "RemoveContainer" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.382634 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67\": container with ID starting with c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67 not found: ID does not exist" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.382674 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67"} err="failed to get container status \"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67\": rpc error: code = NotFound desc = could not find container \"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67\": container with ID starting with c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67 not found: ID does not exist" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.384802 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data" (OuterVolumeSpecName: "config-data") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.471791 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.579713 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.594668 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.622845 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.623662 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.623758 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.623854 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.623924 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.623999 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.624083 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 
13:47:23.624392 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.624485 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.624575 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.626727 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.634507 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.635110 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.641375 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.732402 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798285 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798331 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpm8\" (UniqueName: 
\"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798365 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798648 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798685 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798755 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798809 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"ceilometer-0\" (UID: 
\"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900493 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900866 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900989 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.901081 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.901166 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.901250 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.902016 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.902169 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.908412 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.911499 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.919332 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.922903 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.924111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.983032 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.236530 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerStarted","Data":"75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6"} Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.238329 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerStarted","Data":"2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9"} Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.292342 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerStarted","Data":"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"} Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.292558 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.316763 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" podStartSLOduration=3.316745933 podStartE2EDuration="3.316745933s" podCreationTimestamp="2026-02-17 13:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:24.314544544 +0000 UTC m=+1318.425963881" watchObservedRunningTime="2026-02-17 13:47:24.316745933 +0000 UTC m=+1318.428165270" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.564689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.611751 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" path="/var/lib/kubelet/pods/e5ccd477-88cd-4284-9de7-f336def1c7a1/volumes" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.873629 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-58989b55cb-zjfvf" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.302298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.302910 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"4b8e0eba24a3942ba5514c36c6f889a4738be910f9bdb4e385c1915be330d79c"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.304882 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerStarted","Data":"723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.306790 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerStarted","Data":"761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.306954 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" 
containerID="cri-o://75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6" gracePeriod=30 Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.306970 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" containerID="cri-o://761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6" gracePeriod=30 Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.332312 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.564690298 podStartE2EDuration="4.332295785s" podCreationTimestamp="2026-02-17 13:47:21 +0000 UTC" firstStartedPulling="2026-02-17 13:47:22.21823624 +0000 UTC m=+1316.329655577" lastFinishedPulling="2026-02-17 13:47:22.985841727 +0000 UTC m=+1317.097261064" observedRunningTime="2026-02-17 13:47:25.324550802 +0000 UTC m=+1319.435970139" watchObservedRunningTime="2026-02-17 13:47:25.332295785 +0000 UTC m=+1319.443715122" Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.356050 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.356028932 podStartE2EDuration="4.356028932s" podCreationTimestamp="2026-02-17 13:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:25.351406416 +0000 UTC m=+1319.462825753" watchObservedRunningTime="2026-02-17 13:47:25.356028932 +0000 UTC m=+1319.467448269" Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318417 4804 generic.go:334] "Generic (PLEG): container finished" podID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerID="761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6" exitCode=0 Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318466 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerID="75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6" exitCode=143 Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318504 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerDied","Data":"761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6"} Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318566 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerDied","Data":"75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6"} Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.778444 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.822026 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.874857 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.875078 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log" containerID="cri-o://3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a" gracePeriod=30 Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.875457 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api" containerID="cri-o://7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06" gracePeriod=30 Feb 17 13:47:26 crc 
kubenswrapper[4804]: I0217 13:47:26.967232 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.198430 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.316749 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.316982 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api" containerID="cri-o://c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" gracePeriod=30 Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.317586 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" containerID="cri-o://b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" gracePeriod=30 Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.323595 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357175 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c576cfd85-655nj"] Feb 17 13:47:27 crc kubenswrapper[4804]: E0217 13:47:27.357652 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357672 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" Feb 17 13:47:27 crc kubenswrapper[4804]: E0217 13:47:27.357707 4804 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357715 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357929 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357948 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.359028 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.359775 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b"} Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.367054 4804 generic.go:334] "Generic (PLEG): container finished" podID="410af4be-4a66-404d-9809-f58444bc6473" containerID="3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a" exitCode=143 Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.367100 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerDied","Data":"3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a"} Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.378506 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.378745 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerDied","Data":"3fb4a87b299d4238d785f3f50f31c965eb4578228964e6ab3bbb8b0fd289c1da"} Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.378785 4804 scope.go:117] "RemoveContainer" containerID="761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381474 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381865 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381914 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382024 4804 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382131 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382240 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.383332 4804 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.383962 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs" (OuterVolumeSpecName: "logs") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.384038 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c576cfd85-655nj"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.391102 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.391132 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts" (OuterVolumeSpecName: "scripts") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.392439 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd" (OuterVolumeSpecName: "kube-api-access-wfrmd") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "kube-api-access-wfrmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.428223 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": read tcp 10.217.0.2:54846->10.217.0.155:9696: read: connection reset by peer" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.439835 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.468356 4804 scope.go:117] "RemoveContainer" containerID="75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.483978 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data" (OuterVolumeSpecName: "config-data") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484718 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-combined-ca-bundle\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484775 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484817 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nnts\" (UniqueName: \"kubernetes.io/projected/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-kube-api-access-2nnts\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484893 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-httpd-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484958 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-public-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484998 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-internal-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485075 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-ovndb-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485143 4804 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485155 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485163 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485171 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485180 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485190 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591263 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-public-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-internal-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591369 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-ovndb-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591403 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-combined-ca-bundle\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591429 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591460 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nnts\" (UniqueName: \"kubernetes.io/projected/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-kube-api-access-2nnts\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591500 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-httpd-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.596781 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-combined-ca-bundle\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.597096 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-httpd-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.597217 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-public-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.597977 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.601416 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-ovndb-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.602137 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-internal-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.610968 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nnts\" (UniqueName: \"kubernetes.io/projected/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-kube-api-access-2nnts\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.684006 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.724368 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.741300 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.807797 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.810994 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.816288 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.816605 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.816708 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.819610 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907669 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-etc-machine-id\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907741 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907778 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-public-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907815 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhsk\" (UniqueName: \"kubernetes.io/projected/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-kube-api-access-mmhsk\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907888 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907935 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data-custom\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907961 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-scripts\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.908010 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.908188 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-logs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010187 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-logs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010384 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-etc-machine-id\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010422 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-public-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010487 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhsk\" (UniqueName: \"kubernetes.io/projected/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-kube-api-access-mmhsk\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010518 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data-custom\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010606 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-scripts\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010457 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-etc-machine-id\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010675 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.013998 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-logs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.017702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-scripts\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.017868 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-public-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.019793 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.021788 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.030782 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data-custom\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.031865 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.037789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhsk\" (UniqueName: \"kubernetes.io/projected/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-kube-api-access-mmhsk\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.136550 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.338179 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c576cfd85-655nj"]
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.408550 4804 generic.go:334] "Generic (PLEG): container finished" podID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" exitCode=0
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.408742 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerDied","Data":"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c"}
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.411733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c576cfd85-655nj" event={"ID":"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66","Type":"ContainerStarted","Data":"80395f7cf08a3f295690844faf186f0d36b5ab94d7a70807624bfa83ff416d77"}
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.590777 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" path="/var/lib/kubelet/pods/0586d6d2-92ba-4c34-9153-3de3fe22add2/volumes"
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.701537 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.984587 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused"
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.454375 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92","Type":"ContainerStarted","Data":"7efbb62ff61c35ea16cabda7c84e1279ca8d2af07fb7491fd2094190ad1846f1"}
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.489487 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca"}
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.508459 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c576cfd85-655nj" event={"ID":"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66","Type":"ContainerStarted","Data":"32bc21f5abd2fb63b5c3e9a028dd07b9e583eb42bdf92ccbe33b8b4f924c450d"}
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.508687 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c576cfd85-655nj" event={"ID":"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66","Type":"ContainerStarted","Data":"6cdee8a2a44746e99685b57fd02fda0803227405c0ea737ecba590dd9cb4f9d0"}
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.510222 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c576cfd85-655nj"
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.541885 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c576cfd85-655nj" podStartSLOduration=2.541864628 podStartE2EDuration="2.541864628s" podCreationTimestamp="2026-02-17 13:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:29.535667122 +0000 UTC m=+1323.647086459" watchObservedRunningTime="2026-02-17 13:47:29.541864628 +0000 UTC m=+1323.653283965"
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.602076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9ffb6f5c6-fczv5"
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.668215 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"]
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.668883 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58989b55cb-zjfvf" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" containerID="cri-o://d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3" gracePeriod=30
Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.669321 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58989b55cb-zjfvf" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" containerID="cri-o://c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983" gracePeriod=30
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.107980 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:54514->10.217.0.163:9311: read: connection reset by peer"
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.108034 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:54504->10.217.0.163:9311: read: connection reset by peer"
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.519275 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92","Type":"ContainerStarted","Data":"c7468c9ea7959857ebbdf390fc902dc07aa8ed275c14ba41579ead40e9970dbe"}
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.521510 4804 generic.go:334] "Generic (PLEG): container finished" podID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerID="c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983" exitCode=0
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.521570 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerDied","Data":"c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983"}
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.525610 4804 generic.go:334] "Generic (PLEG): container finished" podID="410af4be-4a66-404d-9809-f58444bc6473" containerID="7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06" exitCode=0
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.526538 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerDied","Data":"7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06"}
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.728814 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.879299 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") "
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.879459 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") "
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.879657 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") "
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.880001 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs" (OuterVolumeSpecName: "logs") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.880525 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") "
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.880613 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") "
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.881212 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.907949 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.911516 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729" (OuterVolumeSpecName: "kube-api-access-ks729") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "kube-api-access-ks729". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.932569 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.958870 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data" (OuterVolumeSpecName: "config-data") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983333 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983386 4804 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983397 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983406 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.540359 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerDied","Data":"0345398b72b1e224ae72d70254cce0d2edce23c11bae1237519d2ed6af3adbe1"}
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.540773 4804 scope.go:117] "RemoveContainer" containerID="7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.540413 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.543683 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92","Type":"ContainerStarted","Data":"67d0f359ca6c010971e9a845ebb55aaabb794027eca2c4ed6c73d3a10f7dc586"}
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.543902 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.550999 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a"}
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.551058 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.579306 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.579286631 podStartE2EDuration="4.579286631s" podCreationTimestamp="2026-02-17 13:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:31.574296784 +0000 UTC m=+1325.685716151" watchObservedRunningTime="2026-02-17 13:47:31.579286631 +0000 UTC m=+1325.690705978"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.603447 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.228039787 podStartE2EDuration="8.603427889s" podCreationTimestamp="2026-02-17 13:47:23 +0000 UTC" firstStartedPulling="2026-02-17 13:47:24.605360165 +0000 UTC m=+1318.716779502" lastFinishedPulling="2026-02-17 13:47:30.980748267 +0000 UTC m=+1325.092167604" observedRunningTime="2026-02-17 13:47:31.601995055 +0000 UTC m=+1325.713414412" watchObservedRunningTime="2026-02-17 13:47:31.603427889 +0000 UTC m=+1325.714847226"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.621399 4804 scope.go:117] "RemoveContainer" containerID="3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.630174 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"]
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.637698 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"]
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.867438 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj"
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.984632 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"]
Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.985140 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns" containerID="cri-o://d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31" gracePeriod=10
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.324211 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.384647 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: connect: connection refused"
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.408174 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.561634 4804 generic.go:334] "Generic (PLEG): container finished" podID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerID="d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31" exitCode=0
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.562365 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerDied","Data":"d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31"}
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.562891 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler" containerID="cri-o://2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9" gracePeriod=30
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.563086 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe" containerID="cri-o://723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99" gracePeriod=30
Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.591625 4804
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410af4be-4a66-404d-9809-f58444bc6473" path="/var/lib/kubelet/pods/410af4be-4a66-404d-9809-f58444bc6473/volumes" Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.995121 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.129451 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.130288 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.130793 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.130902 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.131066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.131216 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.139618 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs" (OuterVolumeSpecName: "kube-api-access-4fgvs") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "kube-api-access-4fgvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.180561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.207004 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.210806 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config" (OuterVolumeSpecName: "config") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234052 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234087 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234097 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234105 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234693 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.252831 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.335699 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.335732 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.552717 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.572830 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.572892 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerDied","Data":"1ceb04ce2633cdf168f7ec2c7223a7b5436da513a4112c0b4cec53ae79c55d6e"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.573727 4804 scope.go:117] "RemoveContainer" containerID="d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.580963 4804 generic.go:334] "Generic (PLEG): container finished" podID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerID="723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99" exitCode=0 Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.581274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerDied","Data":"723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587520 4804 generic.go:334] "Generic (PLEG): container finished" podID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" exitCode=0 Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587573 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerDied","Data":"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587594 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587608 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerDied","Data":"b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.604644 4804 scope.go:117] "RemoveContainer" containerID="61ec7f44479bb9daa4c9c91948a35a994fd304cc644764be5a4bbb119a672347" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.635283 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.640349 4804 scope.go:117] "RemoveContainer" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.647670 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.659211 4804 scope.go:117] "RemoveContainer" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.685173 4804 scope.go:117] "RemoveContainer" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" Feb 17 13:47:33 crc kubenswrapper[4804]: E0217 13:47:33.685653 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c\": container with ID starting with b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c not found: ID does not exist" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.685697 4804 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c"} err="failed to get container status \"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c\": rpc error: code = NotFound desc = could not find container \"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c\": container with ID starting with b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c not found: ID does not exist" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.685719 4804 scope.go:117] "RemoveContainer" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" Feb 17 13:47:33 crc kubenswrapper[4804]: E0217 13:47:33.686258 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef\": container with ID starting with c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef not found: ID does not exist" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.686283 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef"} err="failed to get container status \"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef\": rpc error: code = NotFound desc = could not find container \"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef\": container with ID starting with c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef not found: ID does not exist" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742163 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod 
\"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742276 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742304 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742331 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742358 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742494 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742567 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.757173 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c" (OuterVolumeSpecName: "kube-api-access-6642c") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "kube-api-access-6642c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.760072 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.786497 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config" (OuterVolumeSpecName: "config") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.792113 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.806733 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.807307 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.819792 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846374 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846568 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846680 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846753 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846812 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846867 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846996 4804 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.006667 4804 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.016416 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.589402 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" path="/var/lib/kubelet/pods/3dd4a1b7-336a-4b57-a341-a413ccd8a223/volumes" Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.590471 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" path="/var/lib/kubelet/pods/b85d5058-b075-42ca-8d69-a86cfc1bd01c/volumes" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.629704 4804 generic.go:334] "Generic (PLEG): container finished" podID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerID="2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9" exitCode=0 Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.629808 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerDied","Data":"2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9"} Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.630303 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerDied","Data":"2618a56a4b1417c4a63c4fff93e5f2af5e701449d4dbe686563ceaa84785f504"} Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.630320 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2618a56a4b1417c4a63c4fff93e5f2af5e701449d4dbe686563ceaa84785f504" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.635990 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.799772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.799950 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800042 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800081 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800117 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800179 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4p6n\" (UniqueName: 
\"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.801192 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.807725 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts" (OuterVolumeSpecName: "scripts") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.810511 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n" (OuterVolumeSpecName: "kube-api-access-t4p6n") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "kube-api-access-t4p6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.830421 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.883326 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905454 4804 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905494 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905502 4804 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905511 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905519 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4p6n\" (UniqueName: \"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.919318 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data" (OuterVolumeSpecName: "config-data") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.007481 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.637785 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.669450 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.675865 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.694563 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695017 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695042 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api" Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695066 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695075 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler" Feb 17 13:47:37 crc 
kubenswrapper[4804]: E0217 13:47:37.695093 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695101 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns"
Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695116 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695125 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log"
Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695137 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695144 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api"
Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695160 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="init"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695168 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="init"
Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695179 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695186 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe"
Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695220 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695230 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695440 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695456 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695471 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695482 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695498 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695505 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695517 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.696847 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.701491 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.710047 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.822924 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823037 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7170af0-a08f-4b96-b93a-5353d633a82f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823095 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823155 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlbsm\" (UniqueName: \"kubernetes.io/projected/f7170af0-a08f-4b96-b93a-5353d633a82f-kube-api-access-rlbsm\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823190 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823306 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925494 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925588 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlbsm\" (UniqueName: \"kubernetes.io/projected/f7170af0-a08f-4b96-b93a-5353d633a82f-kube-api-access-rlbsm\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925624 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925646 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925748 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7170af0-a08f-4b96-b93a-5353d633a82f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925915 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7170af0-a08f-4b96-b93a-5353d633a82f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.931420 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.932735 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.938626 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.940983 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.945242 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlbsm\" (UniqueName: \"kubernetes.io/projected/f7170af0-a08f-4b96-b93a-5353d633a82f-kube-api-access-rlbsm\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.021700 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.084906 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-9cc757857-wng6k"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.282801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d69649784-lnwhw"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.290833 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d69649784-lnwhw"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.317563 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67b49bc6f6-6kg64"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.398604 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67b49bc6f6-6kg64"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.415422 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"]
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.594177 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" path="/var/lib/kubelet/pods/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6/volumes"
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.616689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 13:47:38 crc kubenswrapper[4804]: W0217 13:47:38.628510 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7170af0_a08f_4b96_b93a_5353d633a82f.slice/crio-a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4 WatchSource:0}: Error finding container a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4: Status 404 returned error can't find the container with id a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4
Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.645845 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f7170af0-a08f-4b96-b93a-5353d633a82f","Type":"ContainerStarted","Data":"a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4"}
Feb 17 13:47:39 crc kubenswrapper[4804]: I0217 13:47:39.658487 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67b49bc6f6-6kg64" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" containerID="cri-o://e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" gracePeriod=30
Feb 17 13:47:39 crc kubenswrapper[4804]: I0217 13:47:39.659192 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f7170af0-a08f-4b96-b93a-5353d633a82f","Type":"ContainerStarted","Data":"11c79fb4cb1dc78a4695196d01c8b73f4a5dea4519d6b0c91d65561adadfed48"}
Feb 17 13:47:39 crc kubenswrapper[4804]: I0217 13:47:39.659495 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67b49bc6f6-6kg64" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" containerID="cri-o://a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" gracePeriod=30
Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.446679 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.669310 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f7170af0-a08f-4b96-b93a-5353d633a82f","Type":"ContainerStarted","Data":"df5f6549084c31c50fa3a697556bed7273dfc81dfa61851a2013f5aa0e70ef80"}
Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.671705 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c441055-8615-497e-8754-d107b3be24c7" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" exitCode=143
Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.671741 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerDied","Data":"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec"}
Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.693640 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.693621127 podStartE2EDuration="3.693621127s" podCreationTimestamp="2026-02-17 13:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:40.689600301 +0000 UTC m=+1334.801019668" watchObservedRunningTime="2026-02-17 13:47:40.693621127 +0000 UTC m=+1334.805040464"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.000746 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.006511 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.008811 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-29ss8"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.013452 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.016279 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.019218 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092450 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092699 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092763 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194277 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194412 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194463 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194485 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.195395 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.201992 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.203609 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.211808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.225041 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.226248 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.236025 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.292400 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.293755 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.301904 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 13:47:41 crc kubenswrapper[4804]: E0217 13:47:41.345523 4804 log.go:32] "RunPodSandbox from runtime service failed" err=<
Feb 17 13:47:41 crc kubenswrapper[4804]: 	rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e1213875-d9b5-42f3-ab21-54ea5f12ea7c_0(b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652" Netns:"/var/run/netns/616f34e0-73e5-4b7e-a2e0-c50d135f5e9d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652;K8S_POD_UID=e1213875-d9b5-42f3-ab21-54ea5f12ea7c" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e1213875-d9b5-42f3-ab21-54ea5f12ea7c]: expected pod UID "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" but got "de1a53e3-68ce-4ecd-9c0a-80ffce568891" from Kube API
Feb 17 13:47:41 crc kubenswrapper[4804]: 	': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 17 13:47:41 crc kubenswrapper[4804]: >
Feb 17 13:47:41 crc kubenswrapper[4804]: E0217 13:47:41.345610 4804 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Feb 17 13:47:41 crc kubenswrapper[4804]: 	rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e1213875-d9b5-42f3-ab21-54ea5f12ea7c_0(b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652" Netns:"/var/run/netns/616f34e0-73e5-4b7e-a2e0-c50d135f5e9d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652;K8S_POD_UID=e1213875-d9b5-42f3-ab21-54ea5f12ea7c" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e1213875-d9b5-42f3-ab21-54ea5f12ea7c]: expected pod UID "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" but got "de1a53e3-68ce-4ecd-9c0a-80ffce568891" from Kube API
Feb 17 13:47:41 crc kubenswrapper[4804]: 	': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 17 13:47:41 crc kubenswrapper[4804]: > pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.397751 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.397807 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config-secret\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.397918 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.398043 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfjjx\" (UniqueName: \"kubernetes.io/projected/de1a53e3-68ce-4ecd-9c0a-80ffce568891-kube-api-access-hfjjx\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.499834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfjjx\" (UniqueName: \"kubernetes.io/projected/de1a53e3-68ce-4ecd-9c0a-80ffce568891-kube-api-access-hfjjx\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.499927 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.499960 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config-secret\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.500030 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.506122 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.507223 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.510699 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config-secret\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.525764 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfjjx\" (UniqueName: \"kubernetes.io/projected/de1a53e3-68ce-4ecd-9c0a-80ffce568891-kube-api-access-hfjjx\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.683764 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.687590 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e1213875-d9b5-42f3-ab21-54ea5f12ea7c" podUID="de1a53e3-68ce-4ecd-9c0a-80ffce568891"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.694334 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.695129 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.812612 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") "
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.814497 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") "
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.814629 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") "
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.814677 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") "
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.815105 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.815467 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.818602 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn" (OuterVolumeSpecName: "kube-api-access-vd4fn") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "kube-api-access-vd4fn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.819424 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.823340 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.917318 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.917351 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.917359 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.193273 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 13:47:42 crc kubenswrapper[4804]: W0217 13:47:42.194315 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1a53e3_68ce_4ecd_9c0a_80ffce568891.slice/crio-6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f WatchSource:0}: Error finding container 6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f: Status 404 returned error can't find the container with id 6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f
Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.584320 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1213875-d9b5-42f3-ab21-54ea5f12ea7c" path="/var/lib/kubelet/pods/e1213875-d9b5-42f3-ab21-54ea5f12ea7c/volumes"
Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.705341 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.705333 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de1a53e3-68ce-4ecd-9c0a-80ffce568891","Type":"ContainerStarted","Data":"6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f"}
Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.713854 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e1213875-d9b5-42f3-ab21-54ea5f12ea7c" podUID="de1a53e3-68ce-4ecd-9c0a-80ffce568891"
Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.022712 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.260092 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64"
Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343101 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") "
Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") "
Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343264 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: 
\"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343414 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343439 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343509 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343581 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.344710 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs" (OuterVolumeSpecName: "logs") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.350872 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9" (OuterVolumeSpecName: "kube-api-access-zsgq9") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "kube-api-access-zsgq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.352031 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts" (OuterVolumeSpecName: "scripts") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.428505 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446417 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446455 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446468 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446478 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.461604 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data" (OuterVolumeSpecName: "config-data") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.472301 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.481351 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.548459 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.548488 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.548496 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718274 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c441055-8615-497e-8754-d107b3be24c7" containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" exitCode=0 Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718337 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718329 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerDied","Data":"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c"} Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718385 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerDied","Data":"ba0cc2230bff6e65b06b28d38b4ed605a390c7cff5692c7c90bbd33cf0934ef3"} Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718405 4804 scope.go:117] "RemoveContainer" containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.742782 4804 scope.go:117] "RemoveContainer" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.763612 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.770916 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775158 4804 scope.go:117] "RemoveContainer" containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" Feb 17 13:47:43 crc kubenswrapper[4804]: E0217 13:47:43.775628 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c\": container with ID starting with a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c not found: ID does not exist" 
containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775662 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c"} err="failed to get container status \"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c\": rpc error: code = NotFound desc = could not find container \"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c\": container with ID starting with a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c not found: ID does not exist" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775684 4804 scope.go:117] "RemoveContainer" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" Feb 17 13:47:43 crc kubenswrapper[4804]: E0217 13:47:43.775918 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec\": container with ID starting with e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec not found: ID does not exist" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775942 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec"} err="failed to get container status \"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec\": rpc error: code = NotFound desc = could not find container \"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec\": container with ID starting with e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec not found: ID does not exist" Feb 17 13:47:44 crc kubenswrapper[4804]: I0217 13:47:44.584927 4804 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c441055-8615-497e-8754-d107b3be24c7" path="/var/lib/kubelet/pods/8c441055-8615-497e-8754-d107b3be24c7/volumes" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.553821 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-59cfdfc65f-48l6n"] Feb 17 13:47:45 crc kubenswrapper[4804]: E0217 13:47:45.570925 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.570967 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" Feb 17 13:47:45 crc kubenswrapper[4804]: E0217 13:47:45.570983 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.570990 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.571192 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.571226 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.572185 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.575764 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.576891 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.577008 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.580297 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-59cfdfc65f-48l6n"] Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691391 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfbb\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-kube-api-access-hlfbb\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691581 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-internal-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691714 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-run-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc 
kubenswrapper[4804]: I0217 13:47:45.691769 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-public-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691797 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-log-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691900 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-config-data\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.692362 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-combined-ca-bundle\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.692434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-etc-swift\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" 
Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794169 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-internal-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794277 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-run-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-public-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-log-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794361 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-config-data\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794383 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-combined-ca-bundle\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794402 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-etc-swift\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794437 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfbb\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-kube-api-access-hlfbb\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.796615 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-run-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.799329 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-log-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.801270 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-public-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.803360 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-internal-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.803505 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-combined-ca-bundle\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.808528 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-config-data\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.817112 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfbb\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-kube-api-access-hlfbb\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.817853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-etc-swift\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.869753 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870027 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" containerID="cri-o://1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870048 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" containerID="cri-o://35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870137 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" containerID="cri-o://092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870096 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" containerID="cri-o://99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.889945 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" probeResult="failure" 
output="Get \"http://10.217.0.168:3000/\": EOF" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.893730 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.752871 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a" exitCode=0 Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753158 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca" exitCode=2 Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753170 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141" exitCode=0 Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.752964 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a"} Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753235 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca"} Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141"} Feb 17 13:47:48 crc kubenswrapper[4804]: I0217 13:47:48.344360 4804 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.222676 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.224454 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" containerID="cri-o://953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9" gracePeriod=30 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.224522 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" containerID="cri-o://b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be" gracePeriod=30 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.799053 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b" exitCode=0 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.799110 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b"} Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.801740 4804 generic.go:334] "Generic (PLEG): container finished" podID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerID="953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9" exitCode=143 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.801783 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerDied","Data":"953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9"} Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.246480 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.247022 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" containerID="cri-o://3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f" gracePeriod=30 Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.247502 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" containerID="cri-o://74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b" gracePeriod=30 Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.813263 4804 generic.go:334] "Generic (PLEG): container finished" podID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerID="3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f" exitCode=143 Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.813348 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerDied","Data":"3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f"} Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.861626 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-59cfdfc65f-48l6n"] Feb 17 13:47:51 crc kubenswrapper[4804]: W0217 13:47:51.905708 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe0372d3_4646_46e7_af04_6977a7426f35.slice/crio-01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e WatchSource:0}: Error finding container 01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e: Status 404 returned error can't find the container with id 01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.942003 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.010737 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.012805 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013058 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013474 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc 
kubenswrapper[4804]: I0217 13:47:52.014308 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.014489 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.014617 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013632 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013737 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts" (OuterVolumeSpecName: "scripts") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.014130 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.016298 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8" (OuterVolumeSpecName: "kube-api-access-ljpm8") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "kube-api-access-ljpm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.070430 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117430 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117458 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117470 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117482 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117492 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.158663 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.168505 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data" (OuterVolumeSpecName: "config-data") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.219689 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.219730 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.821631 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de1a53e3-68ce-4ecd-9c0a-80ffce568891","Type":"ContainerStarted","Data":"84f3b28dc2b3056e3fb5313ca3e48c8ca136793dc253000eb641f73f1de2f9a8"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.825099 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"4b8e0eba24a3942ba5514c36c6f889a4738be910f9bdb4e385c1915be330d79c"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.825146 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.825216 4804 scope.go:117] "RemoveContainer" containerID="35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827241 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59cfdfc65f-48l6n" event={"ID":"be0372d3-4646-46e7-af04-6977a7426f35","Type":"ContainerStarted","Data":"9cbe66341df3b9590a351bd7c02ac11b961aa3da92b98d61d1d34240c4563e86"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59cfdfc65f-48l6n" event={"ID":"be0372d3-4646-46e7-af04-6977a7426f35","Type":"ContainerStarted","Data":"c8c0f66b6cbbe67c8aab35aeb07fc30545c1b5f65ed15816fa40b4ca13f37252"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827288 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59cfdfc65f-48l6n" event={"ID":"be0372d3-4646-46e7-af04-6977a7426f35","Type":"ContainerStarted","Data":"01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827789 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827882 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.845545 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.055751727 podStartE2EDuration="11.845521974s" podCreationTimestamp="2026-02-17 13:47:41 +0000 UTC" firstStartedPulling="2026-02-17 13:47:42.196969092 +0000 UTC m=+1336.308388439" lastFinishedPulling="2026-02-17 13:47:51.986739349 +0000 UTC m=+1346.098158686" 
observedRunningTime="2026-02-17 13:47:52.840608729 +0000 UTC m=+1346.952028066" watchObservedRunningTime="2026-02-17 13:47:52.845521974 +0000 UTC m=+1346.956941321" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.855174 4804 scope.go:117] "RemoveContainer" containerID="99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.872898 4804 scope.go:117] "RemoveContainer" containerID="092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.883167 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-59cfdfc65f-48l6n" podStartSLOduration=7.883151647 podStartE2EDuration="7.883151647s" podCreationTimestamp="2026-02-17 13:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:52.875661602 +0000 UTC m=+1346.987080939" watchObservedRunningTime="2026-02-17 13:47:52.883151647 +0000 UTC m=+1346.994570984" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.904136 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.905446 4804 scope.go:117] "RemoveContainer" containerID="1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.920603 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.940748 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941270 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941305 4804 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941323 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941332 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941354 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941361 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941397 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941403 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941600 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941615 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941624 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 
13:47:52.941645 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.947960 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.949313 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.956936 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.957544 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034075 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034368 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034440 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034489 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034510 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034572 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136051 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136099 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136129 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136186 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136235 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136299 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: 
I0217 13:47:53.137721 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.141695 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.141985 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.142945 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.143340 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.149703 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " 
pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.164187 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.277382 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.567004 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.570066 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.590892 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.670255 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.674574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.674700 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"nova-api-db-create-582lj\" (UID: 
\"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.675096 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.681548 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.704869 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.732510 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.734022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.752856 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.777833 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.778130 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.778266 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.778446 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.779702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.803067 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.804546 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.819469 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.820064 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.868267 4804 generic.go:334] "Generic (PLEG): container finished" podID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerID="b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be" exitCode=0 Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.868359 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerDied","Data":"b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be"} Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879616 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " 
pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879656 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879807 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.884700 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.886583 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.888182 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.889554 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.889848 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.914929 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.936663 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.970026 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.988284 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.988371 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.991993 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.992160 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.992234 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.992304 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.993789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.007659 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.008825 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.067207 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.068171 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.069332 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.083629 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.090578 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.095929 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.096066 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.096245 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.096288 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.097082 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.097597 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.117542 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.118925 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.166077 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.194736 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.197889 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.198030 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.282297 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.299496 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300037 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300224 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300262 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300328 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300348 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300393 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300473 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300729 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300915 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.301042 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.301206 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.302834 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.304670 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs" (OuterVolumeSpecName: "logs") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.307263 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts" (OuterVolumeSpecName: "scripts") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.315879 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.319997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck" (OuterVolumeSpecName: "kube-api-access-9djck") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "kube-api-access-9djck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.324191 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.363152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.400670 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403183 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403256 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403269 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403278 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403303 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403313 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.419760 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.420946 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data" (OuterVolumeSpecName: "config-data") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.438382 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.507247 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.507498 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.599387 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" path="/var/lib/kubelet/pods/da1535f5-a225-489d-af6d-cbfa6042d239/volumes" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.600436 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.669783 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.764447 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:47:54 crc 
kubenswrapper[4804]: I0217 13:47:54.779377 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:47:54 crc kubenswrapper[4804]: W0217 13:47:54.790376 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb316de_cd6e_4f79_9387_81f7a8add771.slice/crio-199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03 WatchSource:0}: Error finding container 199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03: Status 404 returned error can't find the container with id 199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03 Feb 17 13:47:54 crc kubenswrapper[4804]: W0217 13:47:54.790628 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa81aac_8f7a_4947_9fbe_c38851b3652e.slice/crio-b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6 WatchSource:0}: Error finding container b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6: Status 404 returned error can't find the container with id b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6 Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.794668 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.931162 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerStarted","Data":"76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.931221 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" 
event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerStarted","Data":"a5ffec33f37d9af4010f0839f331e730d0f7b8e15e52fe90f544650f10da490a"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.946004 4804 generic.go:334] "Generic (PLEG): container finished" podID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerID="74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b" exitCode=0 Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.946089 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerDied","Data":"74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.951943 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerDied","Data":"eefbbd3f0b520bf32a3f3135f04b4227a82b1cef683b398cd1cff8682da24dc5"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.951996 4804 scope.go:117] "RemoveContainer" containerID="b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.952004 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.954803 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerStarted","Data":"b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.960500 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerStarted","Data":"7fc6124a6d90d9e051ea54c2356413344027c38ef2bbf16638c22f2aa3317a37"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.974778 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"b3be3c859965ba94fbed2ed95fd8afc727f019eeb34bd5903d88d5dbf4d77e51"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.982486 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.982801 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerStarted","Data":"199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.993929 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.003553 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.014289 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 
17 13:47:55 crc kubenswrapper[4804]: E0217 13:47:55.014819 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.014840 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" Feb 17 13:47:55 crc kubenswrapper[4804]: E0217 13:47:55.014874 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.014880 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.015053 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.015074 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.016147 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.019888 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.021543 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.027558 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.048828 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.071940 4804 scope.go:117] "RemoveContainer" containerID="953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9" Feb 17 13:47:55 crc kubenswrapper[4804]: W0217 13:47:55.109393 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92d9081e_1e94_4244_b66a_34b05bc98f2d.slice/crio-0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd WatchSource:0}: Error finding container 0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd: Status 404 returned error can't find the container with id 0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137569 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137642 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-logs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137689 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137707 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137738 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137765 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qkq\" (UniqueName: \"kubernetes.io/projected/cc2e7136-825b-4608-a106-944f359c7369-kube-api-access-79qkq\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 
13:47:55.137779 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137824 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.238991 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-logs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239061 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239080 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239111 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239139 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qkq\" (UniqueName: \"kubernetes.io/projected/cc2e7136-825b-4608-a106-944f359c7369-kube-api-access-79qkq\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239157 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239254 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239297 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239665 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.248856 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.249084 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-logs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.254957 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.254988 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.259820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.260581 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.264262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qkq\" (UniqueName: \"kubernetes.io/projected/cc2e7136-825b-4608-a106-944f359c7369-kube-api-access-79qkq\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.273962 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.392753 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.565474 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656787 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656885 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656922 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656976 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657063 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657091 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657130 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657162 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.659148 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.660565 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs" (OuterVolumeSpecName: "logs") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.688851 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.691533 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts" (OuterVolumeSpecName: "scripts") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.700559 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2" (OuterVolumeSpecName: "kube-api-access-scvw2") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "kube-api-access-scvw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.756466 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data" (OuterVolumeSpecName: "config-data") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.758631 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759423 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759450 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759461 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759470 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759478 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759490 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759508 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.818612 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.822067 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.861746 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.861793 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.994177 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerDied","Data":"ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36"} Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.994243 4804 scope.go:117] "RemoveContainer" 
containerID="74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.994375 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.999734 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6388-account-create-update-skdjv" event={"ID":"92d9081e-1e94-4244-b66a-34b05bc98f2d","Type":"ContainerStarted","Data":"0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd"} Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.000873 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" event={"ID":"3d23eb85-73ab-4049-b6be-486640c922e0","Type":"ContainerStarted","Data":"c524172161ffac83a0b6e7a5805c119f237374e27cb6f6b470e9d29ed3840c55"} Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.005161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerStarted","Data":"feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de"} Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.038939 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-582lj" podStartSLOduration=3.038917464 podStartE2EDuration="3.038917464s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:56.024313285 +0000 UTC m=+1350.135732632" watchObservedRunningTime="2026-02-17 13:47:56.038917464 +0000 UTC m=+1350.150336801" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.056538 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 
13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.078294 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.094561 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:56 crc kubenswrapper[4804]: E0217 13:47:56.094997 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095022 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" Feb 17 13:47:56 crc kubenswrapper[4804]: E0217 13:47:56.095073 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095082 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095317 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095342 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.096392 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.101703 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.101912 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.102923 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173635 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173716 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2rcg\" (UniqueName: \"kubernetes.io/projected/52f268a5-3c72-4655-bb36-823c34e5312d-kube-api-access-b2rcg\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173806 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173852 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173913 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173982 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.174032 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-logs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.174077 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275713 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275851 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-logs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275976 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276026 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2rcg\" (UniqueName: 
\"kubernetes.io/projected/52f268a5-3c72-4655-bb36-823c34e5312d-kube-api-access-b2rcg\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276101 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276140 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276272 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276530 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-logs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.277110 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.280754 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.281078 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.281537 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.281569 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.295463 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2rcg\" (UniqueName: \"kubernetes.io/projected/52f268a5-3c72-4655-bb36-823c34e5312d-kube-api-access-b2rcg\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.311328 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.415316 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.482429 4804 scope.go:117] "RemoveContainer" containerID="3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.639446 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" path="/var/lib/kubelet/pods/185b3c31-7ccc-4f8d-bcb1-20cabbf50943/volumes" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.640329 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" path="/var/lib/kubelet/pods/8ec519a7-9081-4341-ad6c-c81dda70bd3a/volumes" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.014790 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerStarted","Data":"75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce"} Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.018285 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerStarted","Data":"62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3"} Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.041022 4804 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell0-db-create-6h6dp" podStartSLOduration=4.040997403 podStartE2EDuration="4.040997403s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:57.032139324 +0000 UTC m=+1351.143558671" watchObservedRunningTime="2026-02-17 13:47:57.040997403 +0000 UTC m=+1351.152416750" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.051517 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-nn6tq" podStartSLOduration=4.051493893 podStartE2EDuration="4.051493893s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:57.046651991 +0000 UTC m=+1351.158071328" watchObservedRunningTime="2026-02-17 13:47:57.051493893 +0000 UTC m=+1351.162913230" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.064126 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-570c-account-create-update-48hmw" podStartSLOduration=4.06410912 podStartE2EDuration="4.06410912s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:57.061531309 +0000 UTC m=+1351.172950646" watchObservedRunningTime="2026-02-17 13:47:57.06410912 +0000 UTC m=+1351.175528457" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.127616 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.265792 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:57 crc kubenswrapper[4804]: W0217 
13:47:57.270672 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f268a5_3c72_4655_bb36_823c34e5312d.slice/crio-d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab WatchSource:0}: Error finding container d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab: Status 404 returned error can't find the container with id d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.700753 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.775243 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.775884 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-547f989fd6-rqkvc" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-api" containerID="cri-o://8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef" gracePeriod=30 Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.776192 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-547f989fd6-rqkvc" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" containerID="cri-o://60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad" gracePeriod=30 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.049463 4804 generic.go:334] "Generic (PLEG): container finished" podID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerID="62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.049549 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" 
event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerDied","Data":"62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.059429 4804 generic.go:334] "Generic (PLEG): container finished" podID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerID="cd29054fcbff23437aedab7f24e705fc390169a8546254413b976c34b8bd4901" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.059530 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6388-account-create-update-skdjv" event={"ID":"92d9081e-1e94-4244-b66a-34b05bc98f2d","Type":"ContainerDied","Data":"cd29054fcbff23437aedab7f24e705fc390169a8546254413b976c34b8bd4901"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.071144 4804 generic.go:334] "Generic (PLEG): container finished" podID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerID="feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.071314 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerDied","Data":"feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.078359 4804 generic.go:334] "Generic (PLEG): container finished" podID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.078427 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerDied","Data":"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.085510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"cc2e7136-825b-4608-a106-944f359c7369","Type":"ContainerStarted","Data":"93ebf4cecf403069ba9d9266b5376f8b2ebb0ce31269d457e86f49f2965016a6"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.085565 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc2e7136-825b-4608-a106-944f359c7369","Type":"ContainerStarted","Data":"59c1c254275218fcd0dec8d899a862c6d1bd00eea43efe6566e036ac8b535b56"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.089440 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"52f268a5-3c72-4655-bb36-823c34e5312d","Type":"ContainerStarted","Data":"d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.098001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.111103 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d23eb85-73ab-4049-b6be-486640c922e0" containerID="04848e079d7c3dd5aec9613ff12ec81fb185688c9c0af0d2f63039d17f192069" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.111177 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" event={"ID":"3d23eb85-73ab-4049-b6be-486640c922e0","Type":"ContainerDied","Data":"04848e079d7c3dd5aec9613ff12ec81fb185688c9c0af0d2f63039d17f192069"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.115063 4804 generic.go:334] "Generic (PLEG): container finished" podID="1517f905-d980-43be-8583-f1a40170752e" containerID="76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0" exitCode=0 Feb 17 13:47:58 crc 
kubenswrapper[4804]: I0217 13:47:58.115146 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerDied","Data":"76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.136428 4804 generic.go:334] "Generic (PLEG): container finished" podID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerID="75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.136663 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerDied","Data":"75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.146575 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.149615 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc2e7136-825b-4608-a106-944f359c7369","Type":"ContainerStarted","Data":"19f884a664455e9314665febd087d4c25c36dd754e8ab45ab4f158f1edbff08d"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.151909 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"52f268a5-3c72-4655-bb36-823c34e5312d","Type":"ContainerStarted","Data":"c0cebc094d70bff51090146dd4586fa1f95f69a8f0dce5560eb8a6ad904ae9aa"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.151940 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"52f268a5-3c72-4655-bb36-823c34e5312d","Type":"ContainerStarted","Data":"905daab68904cbf4ce0b38c94e620f7c4eb4d2d220a0a793f2285c0b9e8354ea"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.187332 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.187307259 podStartE2EDuration="5.187307259s" podCreationTimestamp="2026-02-17 13:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:59.17303997 +0000 UTC m=+1353.284459307" watchObservedRunningTime="2026-02-17 13:47:59.187307259 +0000 UTC m=+1353.298726596" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.863545 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.894705 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.8946875949999997 podStartE2EDuration="3.894687595s" podCreationTimestamp="2026-02-17 13:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:59.203975383 +0000 UTC m=+1353.315394720" watchObservedRunningTime="2026-02-17 13:47:59.894687595 +0000 UTC m=+1354.006106922" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.920311 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.954341 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.973969 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.982763 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.993738 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.000859 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"1517f905-d980-43be-8583-f1a40170752e\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.001113 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"1517f905-d980-43be-8583-f1a40170752e\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.001900 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1517f905-d980-43be-8583-f1a40170752e" (UID: "1517f905-d980-43be-8583-f1a40170752e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.015757 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b" (OuterVolumeSpecName: "kube-api-access-rjp5b") pod "1517f905-d980-43be-8583-f1a40170752e" (UID: "1517f905-d980-43be-8583-f1a40170752e"). InnerVolumeSpecName "kube-api-access-rjp5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102477 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"92d9081e-1e94-4244-b66a-34b05bc98f2d\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102524 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"ccb316de-cd6e-4f79-9387-81f7a8add771\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102572 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " Feb 17 13:48:00 crc 
kubenswrapper[4804]: I0217 13:48:00.102633 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"3d23eb85-73ab-4049-b6be-486640c922e0\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102664 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"92d9081e-1e94-4244-b66a-34b05bc98f2d\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102705 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102790 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"3d23eb85-73ab-4049-b6be-486640c922e0\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102811 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"ccb316de-cd6e-4f79-9387-81f7a8add771\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102911 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzbw2\" (UniqueName: 
\"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103361 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103375 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103767 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d23eb85-73ab-4049-b6be-486640c922e0" (UID: "3d23eb85-73ab-4049-b6be-486640c922e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103797 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92d9081e-1e94-4244-b66a-34b05bc98f2d" (UID: "92d9081e-1e94-4244-b66a-34b05bc98f2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103784 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fa81aac-8f7a-4947-9fbe-c38851b3652e" (UID: "5fa81aac-8f7a-4947-9fbe-c38851b3652e"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.104146 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ccb316de-cd6e-4f79-9387-81f7a8add771" (UID: "ccb316de-cd6e-4f79-9387-81f7a8add771"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.104997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3c65a30-a890-4d85-80ca-93f9420d5aa4" (UID: "f3c65a30-a890-4d85-80ca-93f9420d5aa4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.109798 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2" (OuterVolumeSpecName: "kube-api-access-zzbw2") pod "5fa81aac-8f7a-4947-9fbe-c38851b3652e" (UID: "5fa81aac-8f7a-4947-9fbe-c38851b3652e"). InnerVolumeSpecName "kube-api-access-zzbw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.114467 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7" (OuterVolumeSpecName: "kube-api-access-mjrg7") pod "92d9081e-1e94-4244-b66a-34b05bc98f2d" (UID: "92d9081e-1e94-4244-b66a-34b05bc98f2d"). InnerVolumeSpecName "kube-api-access-mjrg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.114580 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872" (OuterVolumeSpecName: "kube-api-access-qt872") pod "3d23eb85-73ab-4049-b6be-486640c922e0" (UID: "3d23eb85-73ab-4049-b6be-486640c922e0"). InnerVolumeSpecName "kube-api-access-qt872". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.117007 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm" (OuterVolumeSpecName: "kube-api-access-p2rgm") pod "ccb316de-cd6e-4f79-9387-81f7a8add771" (UID: "ccb316de-cd6e-4f79-9387-81f7a8add771"). InnerVolumeSpecName "kube-api-access-p2rgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.117829 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5" (OuterVolumeSpecName: "kube-api-access-6mqt5") pod "f3c65a30-a890-4d85-80ca-93f9420d5aa4" (UID: "f3c65a30-a890-4d85-80ca-93f9420d5aa4"). InnerVolumeSpecName "kube-api-access-6mqt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.167595 4804 generic.go:334] "Generic (PLEG): container finished" podID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerID="d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3" exitCode=137 Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.167668 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerDied","Data":"d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.170555 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerDied","Data":"199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.170587 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.170644 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.174515 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerDied","Data":"a5ffec33f37d9af4010f0839f331e730d0f7b8e15e52fe90f544650f10da490a"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.174543 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ffec33f37d9af4010f0839f331e730d0f7b8e15e52fe90f544650f10da490a" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.174598 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.180813 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerDied","Data":"b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.180849 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.180922 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.185891 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerDied","Data":"7fc6124a6d90d9e051ea54c2356413344027c38ef2bbf16638c22f2aa3317a37"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.185898 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.185925 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc6124a6d90d9e051ea54c2356413344027c38ef2bbf16638c22f2aa3317a37" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.191494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6388-account-create-update-skdjv" event={"ID":"92d9081e-1e94-4244-b66a-34b05bc98f2d","Type":"ContainerDied","Data":"0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.191522 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.191687 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204925 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204956 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204966 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204975 4804 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204985 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205020 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205029 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205038 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205046 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205055 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.207626 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.213920 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" event={"ID":"3d23eb85-73ab-4049-b6be-486640c922e0","Type":"ContainerDied","Data":"c524172161ffac83a0b6e7a5805c119f237374e27cb6f6b470e9d29ed3840c55"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.213989 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c524172161ffac83a0b6e7a5805c119f237374e27cb6f6b470e9d29ed3840c55" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.213938 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.288940 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf"
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.406981 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") "
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407144 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") "
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407227 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") "
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407272 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") "
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407318 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") "
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407422 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") "
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407511 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") "
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407588 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs" (OuterVolumeSpecName: "logs") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.408117 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.414664 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.416284 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7" (OuterVolumeSpecName: "kube-api-access-6wfg7") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "kube-api-access-6wfg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.438937 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts" (OuterVolumeSpecName: "scripts") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.452316 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.469599 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data" (OuterVolumeSpecName: "config-data") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.479571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510318 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510357 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510371 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510383 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510401 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510412 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.913119 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-59cfdfc65f-48l6n"
Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.942005 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-59cfdfc65f-48l6n"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.152942 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.228078 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"}
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.229724 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" containerID="cri-o://3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256" gracePeriod=30
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.229877 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.230055 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" containerID="cri-o://0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6" gracePeriod=30
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.230221 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="sg-core" containerID="cri-o://b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270" gracePeriod=30
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.230272 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent" containerID="cri-o://7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293" gracePeriod=30
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.240177 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerDied","Data":"70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3"}
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.240927 4804 scope.go:117] "RemoveContainer" containerID="c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.241091 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.248759 4804 generic.go:334] "Generic (PLEG): container finished" podID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef" exitCode=0
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.249556 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.249878 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerDied","Data":"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"}
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.249904 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerDied","Data":"f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d"}
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.265799 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5663279660000002 podStartE2EDuration="9.265777123s" podCreationTimestamp="2026-02-17 13:47:52 +0000 UTC" firstStartedPulling="2026-02-17 13:47:53.983558937 +0000 UTC m=+1348.094978274" lastFinishedPulling="2026-02-17 13:48:00.683008094 +0000 UTC m=+1354.794427431" observedRunningTime="2026-02-17 13:48:01.255086387 +0000 UTC m=+1355.366505734" watchObservedRunningTime="2026-02-17 13:48:01.265777123 +0000 UTC m=+1355.377196470"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.278649 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"]
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.295573 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"]
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.332862 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") "
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.333043 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") "
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.333127 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") "
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.333182 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") "
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.333226 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") "
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.337996 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc" (OuterVolumeSpecName: "kube-api-access-x29pc") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "kube-api-access-x29pc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.341805 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.389582 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.398019 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config" (OuterVolumeSpecName: "config") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.420289 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.430043 4804 scope.go:117] "RemoveContainer" containerID="d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435149 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435186 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435220 4804 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435234 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435245 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.541128 4804 scope.go:117] "RemoveContainer" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.616143 4804 scope.go:117] "RemoveContainer" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.623953 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"]
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.636638 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"]
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.637271 4804 scope.go:117] "RemoveContainer" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"
Feb 17 13:48:01 crc kubenswrapper[4804]: E0217 13:48:01.637773 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad\": container with ID starting with 60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad not found: ID does not exist" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.637826 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"} err="failed to get container status \"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad\": rpc error: code = NotFound desc = could not find container \"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad\": container with ID starting with 60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad not found: ID does not exist"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.637856 4804 scope.go:117] "RemoveContainer" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"
Feb 17 13:48:01 crc kubenswrapper[4804]: E0217 13:48:01.638138 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef\": container with ID starting with 8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef not found: ID does not exist" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"
Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.638247 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"} err="failed to get container status \"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef\": rpc error: code = NotFound desc = could not find container \"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef\": container with ID starting with 8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef not found: ID does not exist"
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259904 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6" exitCode=0
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259939 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270" exitCode=2
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259947 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293" exitCode=0
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259992 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"}
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.260049 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"}
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.260064 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"}
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.586272 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" path="/var/lib/kubelet/pods/85415d6a-8a5f-4b65-b182-2bfe221e8eee/volumes"
Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.587131 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" path="/var/lib/kubelet/pods/a2f2352e-7e9b-439f-be3c-b48b70681658/volumes"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.134258 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.274662 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") "
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.274705 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") "
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.275876 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") "
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.275942 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") "
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276084 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") "
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276151 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") "
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276185 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") "
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276499 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.277145 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.277571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.284286 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts" (OuterVolumeSpecName: "scripts") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286145 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256" exitCode=0
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286192 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"}
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286243 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"b3be3c859965ba94fbed2ed95fd8afc727f019eeb34bd5903d88d5dbf4d77e51"}
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286264 4804 scope.go:117] "RemoveContainer" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286487 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.296798 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm" (OuterVolumeSpecName: "kube-api-access-hr5gm") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "kube-api-access-hr5gm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.300734 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.343279 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379463 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379731 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379804 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379874 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379973 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.391028 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data" (OuterVolumeSpecName: "config-data") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.418589 4804 scope.go:117] "RemoveContainer" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.454600 4804 scope.go:117] "RemoveContainer" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.482163 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.496565 4804 scope.go:117] "RemoveContainer" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.516614 4804 scope.go:117] "RemoveContainer" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"
Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.517240 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6\": container with ID starting with 0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6 not found: ID does not exist" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517276 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"} err="failed to get container status \"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6\": rpc error: code = NotFound desc = could not find container \"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6\": container with ID starting with 0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6 not found: ID does not exist"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517330 4804 scope.go:117] "RemoveContainer" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"
Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.517626 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270\": container with ID starting with b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270 not found: ID does not exist" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517673 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"} err="failed to get container status \"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270\": rpc error: code = NotFound desc = could not find container \"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270\": container with ID starting with b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270 not found: ID does not exist"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517692 4804 scope.go:117] "RemoveContainer" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"
Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.517989 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293\": container with ID starting with 7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293 not found: ID does not exist" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.518066 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"} err="failed to get container status \"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293\": rpc error: code = NotFound desc = could not find container \"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293\": container with ID starting with 7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293 not found: ID does not exist"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.518086 4804 scope.go:117] "RemoveContainer" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"
Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.518354 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256\": container with ID starting with 3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256 not found: ID does not exist" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.518383 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"} err="failed to get container status \"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256\": rpc error: code = NotFound desc = could not find container \"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256\": container with ID starting with 3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256 not found: ID does not exist"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.625027 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.632529 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662292 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662763 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662780 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent"
Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662799 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerName="mariadb-database-create"
Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662813 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerName="mariadb-database-create"
Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662826 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" containerName="mariadb-account-create-update"
Feb
17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662832 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662840 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662846 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662857 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662863 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662878 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662883 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662890 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662895 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662910 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" 
containerName="sg-core" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662915 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="sg-core" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662924 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1517f905-d980-43be-8583-f1a40170752e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662930 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1517f905-d980-43be-8583-f1a40170752e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662939 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662945 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662956 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662962 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662972 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662977 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662988 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" 
containerName="neutron-api" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662994 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-api" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.663008 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663014 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663194 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663224 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1517f905-d980-43be-8583-f1a40170752e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663236 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663251 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663265 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663275 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663287 4804 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663298 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="sg-core" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663312 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663327 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663340 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663353 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-api" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663366 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663372 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.669673 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.672906 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.673694 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.673891 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787259 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787339 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787514 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787654 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 
13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787845 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787933 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.788139 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889509 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889743 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889792 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889816 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889888 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889909 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.890980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc 
kubenswrapper[4804]: I0217 13:48:03.891029 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.894361 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.896184 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.902034 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.904066 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.908666 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"ceilometer-0\" (UID: 
\"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.991022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.353486 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.355169 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.358257 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.358297 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rxrxn" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.358599 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.366472 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502252 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502312 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502338 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502376 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.512487 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:04 crc kubenswrapper[4804]: W0217 13:48:04.526090 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f98588b_4340_42cc_af47_1f1d5c0c6d0f.slice/crio-f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a WatchSource:0}: Error finding container f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a: Status 404 returned error can't find the container with id f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.582964 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" path="/var/lib/kubelet/pods/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc/volumes" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603840 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603896 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603922 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603956 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.610144 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.610454 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.610557 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.619513 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.678879 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.136940 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.307384 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb"} Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.307433 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a"} Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.308813 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerStarted","Data":"0b89b75bf22a2ce250e51320c1e06b88aa347683ca582f8db810c8648296646c"} Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.393402 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.393455 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.431959 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.444475 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.176471 4804 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.334070 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310"} Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.334544 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.334591 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.416150 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.417417 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.466902 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.501858 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:07 crc kubenswrapper[4804]: I0217 13:48:07.347762 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115"} Feb 17 13:48:07 crc kubenswrapper[4804]: I0217 13:48:07.348138 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:07 crc kubenswrapper[4804]: I0217 13:48:07.348158 4804 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.368817 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" containerID="cri-o://e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb" gracePeriod=30 Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369330 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2"} Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369372 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369390 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" containerID="cri-o://4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115" gracePeriod=30 Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369447 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" containerID="cri-o://8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2" gracePeriod=30 Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369415 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" containerID="cri-o://706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310" gracePeriod=30 Feb 17 13:48:08 crc 
kubenswrapper[4804]: I0217 13:48:08.397185 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.141068828 podStartE2EDuration="5.397167779s" podCreationTimestamp="2026-02-17 13:48:03 +0000 UTC" firstStartedPulling="2026-02-17 13:48:04.528859463 +0000 UTC m=+1358.640278800" lastFinishedPulling="2026-02-17 13:48:07.784958414 +0000 UTC m=+1361.896377751" observedRunningTime="2026-02-17 13:48:08.393774452 +0000 UTC m=+1362.505193789" watchObservedRunningTime="2026-02-17 13:48:08.397167779 +0000 UTC m=+1362.508587116" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.552032 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.552782 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.566566 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.405437 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2" exitCode=0 Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.405482 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115" exitCode=2 Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.405492 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310" exitCode=0 Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.406404 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2"} Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.406490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115"} Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.406507 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310"} Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.951886 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.951988 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:48:10 crc kubenswrapper[4804]: I0217 13:48:10.378747 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:16 crc kubenswrapper[4804]: I0217 13:48:16.503983 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerStarted","Data":"fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729"} Feb 17 13:48:16 crc kubenswrapper[4804]: I0217 13:48:16.528917 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" podStartSLOduration=2.14612883 podStartE2EDuration="12.528900646s" podCreationTimestamp="2026-02-17 13:48:04 +0000 UTC" firstStartedPulling="2026-02-17 13:48:05.151126483 +0000 UTC m=+1359.262545830" 
lastFinishedPulling="2026-02-17 13:48:15.533898309 +0000 UTC m=+1369.645317646" observedRunningTime="2026-02-17 13:48:16.52392651 +0000 UTC m=+1370.635345857" watchObservedRunningTime="2026-02-17 13:48:16.528900646 +0000 UTC m=+1370.640319983" Feb 17 13:48:18 crc kubenswrapper[4804]: I0217 13:48:18.524380 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb" exitCode=0 Feb 17 13:48:18 crc kubenswrapper[4804]: I0217 13:48:18.524594 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb"} Feb 17 13:48:18 crc kubenswrapper[4804]: I0217 13:48:18.888085 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001274 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001356 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: 
\"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001484 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001562 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001599 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001634 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.002406 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.002624 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.007537 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts" (OuterVolumeSpecName: "scripts") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.020543 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj" (OuterVolumeSpecName: "kube-api-access-8kzxj") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "kube-api-access-8kzxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.031429 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.073485 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103728 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103760 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103771 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103780 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103788 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103798 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.106607 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data" (OuterVolumeSpecName: "config-data") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.205645 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.537373 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a"} Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.537451 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.537805 4804 scope.go:117] "RemoveContainer" containerID="8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.562844 4804 scope.go:117] "RemoveContainer" containerID="4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.576077 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.584002 4804 scope.go:117] "RemoveContainer" containerID="706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.584263 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625233 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625667 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625684 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625710 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625731 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625753 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625759 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625778 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625785 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625960 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625980 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625998 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.626009 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.627956 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.628103 4804 scope.go:117] "RemoveContainer" containerID="e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.631116 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.631362 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.634622 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714032 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714096 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714166 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714617 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714814 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714918 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.715045 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816624 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816683 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816728 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816821 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816847 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816879 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816930 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.817658 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.817856 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.822336 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.822795 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.833857 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.834175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.843234 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.956344 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:20 crc kubenswrapper[4804]: W0217 13:48:20.484090 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4de89973_4899_493b_aacb_b8c3b5c96b5d.slice/crio-34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b WatchSource:0}: Error finding container 34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b: Status 404 returned error can't find the container with id 34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b Feb 17 13:48:20 crc kubenswrapper[4804]: I0217 13:48:20.484539 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:20 crc kubenswrapper[4804]: I0217 13:48:20.546550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b"} Feb 17 13:48:20 crc kubenswrapper[4804]: I0217 13:48:20.584154 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" path="/var/lib/kubelet/pods/2f98588b-4340-42cc-af47-1f1d5c0c6d0f/volumes" Feb 17 13:48:21 crc kubenswrapper[4804]: I0217 13:48:21.557922 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"} Feb 17 13:48:22 crc 
kubenswrapper[4804]: I0217 13:48:22.571853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"} Feb 17 13:48:23 crc kubenswrapper[4804]: I0217 13:48:23.580960 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"} Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.145574 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.592781 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"} Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.593110 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.620340 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.248965435 podStartE2EDuration="5.620315749s" podCreationTimestamp="2026-02-17 13:48:19 +0000 UTC" firstStartedPulling="2026-02-17 13:48:20.486665673 +0000 UTC m=+1374.598085010" lastFinishedPulling="2026-02-17 13:48:23.858015977 +0000 UTC m=+1377.969435324" observedRunningTime="2026-02-17 13:48:24.612451272 +0000 UTC m=+1378.723870619" watchObservedRunningTime="2026-02-17 13:48:24.620315749 +0000 UTC m=+1378.731735086" Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.606447 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent" containerID="cri-o://86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6" gracePeriod=30 Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.607455 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core" containerID="cri-o://76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07" gracePeriod=30 Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.607598 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd" containerID="cri-o://138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285" gracePeriod=30 Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.607672 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-notification-agent" containerID="cri-o://d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab" gracePeriod=30 Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.617890 4804 generic.go:334] "Generic (PLEG): container finished" podID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerID="fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729" exitCode=0 Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.618098 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerDied","Data":"fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729"} Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.623989 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" 
containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285" exitCode=0
Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624031 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07" exitCode=2
Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624047 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab" exitCode=0
Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624023 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"}
Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624130 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"}
Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624167 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"}
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.022278 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067436 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") "
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067583 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") "
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067675 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") "
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067784 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") "
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.084467 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm" (OuterVolumeSpecName: "kube-api-access-qltdm") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "kube-api-access-qltdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.095146 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts" (OuterVolumeSpecName: "scripts") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.099044 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data" (OuterVolumeSpecName: "config-data") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.105776 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169670 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169724 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169748 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169764 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.649281 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.649473 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerDied","Data":"0b89b75bf22a2ce250e51320c1e06b88aa347683ca582f8db810c8648296646c"}
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.649655 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b89b75bf22a2ce250e51320c1e06b88aa347683ca582f8db810c8648296646c"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.722188 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 17 13:48:28 crc kubenswrapper[4804]: E0217 13:48:28.722698 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerName="nova-cell0-conductor-db-sync"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.722724 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerName="nova-cell0-conductor-db-sync"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.722960 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerName="nova-cell0-conductor-db-sync"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.723738 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.727940 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rxrxn"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.728994 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.734979 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.781446 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.781640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz2x\" (UniqueName: \"kubernetes.io/projected/fc78e86d-494e-417b-8569-b564cdbd069a-kube-api-access-mtz2x\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.781796 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.883982 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz2x\" (UniqueName: \"kubernetes.io/projected/fc78e86d-494e-417b-8569-b564cdbd069a-kube-api-access-mtz2x\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.884089 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.884171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.888061 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.889462 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.903975 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz2x\" (UniqueName: \"kubernetes.io/projected/fc78e86d-494e-417b-8569-b564cdbd069a-kube-api-access-mtz2x\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.041582 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.499778 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 17 13:48:29 crc kubenswrapper[4804]: W0217 13:48:29.510027 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc78e86d_494e_417b_8569_b564cdbd069a.slice/crio-2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240 WatchSource:0}: Error finding container 2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240: Status 404 returned error can't find the container with id 2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.546713 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596673 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596811 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596847 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596892 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596991 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.597019 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.598318 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.598354 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.600510 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts" (OuterVolumeSpecName: "scripts") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.618528 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv" (OuterVolumeSpecName: "kube-api-access-854dv") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "kube-api-access-854dv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.668429 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fc78e86d-494e-417b-8569-b564cdbd069a","Type":"ContainerStarted","Data":"2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240"}
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672166 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6" exitCode=0
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672218 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"}
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672244 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b"}
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672284 4804 scope.go:117] "RemoveContainer" containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672332 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.694852 4804 scope.go:117] "RemoveContainer" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.698512 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.698639 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") "
Feb 17 13:48:29 crc kubenswrapper[4804]: W0217 13:48:29.698764 4804 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4de89973-4899-493b-aacb-b8c3b5c96b5d/volumes/kubernetes.io~secret/sg-core-conf-yaml
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.698783 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699098 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699119 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699131 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699141 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699148 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.701075 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.715501 4804 scope.go:117] "RemoveContainer" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.736354 4804 scope.go:117] "RemoveContainer" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.737148 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data" (OuterVolumeSpecName: "config-data") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.759330 4804 scope.go:117] "RemoveContainer" containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"
Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.759726 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285\": container with ID starting with 138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285 not found: ID does not exist" containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.759756 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"} err="failed to get container status \"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285\": rpc error: code = NotFound desc = could not find container \"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285\": container with ID starting with 138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285 not found: ID does not exist"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.759777 4804 scope.go:117] "RemoveContainer" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"
Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.760021 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07\": container with ID starting with 76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07 not found: ID does not exist" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760044 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"} err="failed to get container status \"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07\": rpc error: code = NotFound desc = could not find container \"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07\": container with ID starting with 76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07 not found: ID does not exist"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760057 4804 scope.go:117] "RemoveContainer" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"
Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.760476 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab\": container with ID starting with d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab not found: ID does not exist" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760510 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"} err="failed to get container status \"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab\": rpc error: code = NotFound desc = could not find container \"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab\": container with ID starting with d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab not found: ID does not exist"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760523 4804 scope.go:117] "RemoveContainer" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"
Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.760891 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6\": container with ID starting with 86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6 not found: ID does not exist" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760947 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"} err="failed to get container status \"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6\": rpc error: code = NotFound desc = could not find container \"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6\": container with ID starting with 86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6 not found: ID does not exist"
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.801036 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.801064 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.025181 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.033349 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048367 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048726 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048739 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent"
Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048758 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048765 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core"
Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048773 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-notification-agent"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048780 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-notification-agent"
Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048787 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048793 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048978 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048993 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.049005 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-notification-agent"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.049015 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.050924 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.054492 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.054798 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.066118 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106667 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106729 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106753 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106795 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106816 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106861 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106898 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208447 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208546 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208584 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208618 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208679 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208740 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.209062 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.209463 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.213106 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.213624 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.214240 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.217976 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.231279 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.367514 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.583612 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" path="/var/lib/kubelet/pods/4de89973-4899-493b-aacb-b8c3b5c96b5d/volumes"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.686347 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fc78e86d-494e-417b-8569-b564cdbd069a","Type":"ContainerStarted","Data":"5a9971c09b621119088ea175c0c01a351ea9bd051bc8add3c4b17f4c234c4088"}
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.686433 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.700042 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.700027236 podStartE2EDuration="2.700027236s" podCreationTimestamp="2026-02-17 13:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:30.699373235 +0000 UTC m=+1384.810792572" watchObservedRunningTime="2026-02-17 13:48:30.700027236 +0000 UTC m=+1384.811446573"
Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.849303 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:48:30 crc kubenswrapper[4804]: W0217 13:48:30.851472 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e6284b7_c2bf_491d_a8b8_66390efc3657.slice/crio-2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8
WatchSource:0}: Error finding container 2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8: Status 404 returned error can't find the container with id 2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8 Feb 17 13:48:31 crc kubenswrapper[4804]: I0217 13:48:31.705305 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37"} Feb 17 13:48:31 crc kubenswrapper[4804]: I0217 13:48:31.706457 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8"} Feb 17 13:48:32 crc kubenswrapper[4804]: I0217 13:48:32.713679 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607"} Feb 17 13:48:33 crc kubenswrapper[4804]: I0217 13:48:33.725315 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10"} Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.070413 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.590697 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.592388 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.598045 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.598256 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.600713 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.715891 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.716267 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.716324 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.716413 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.730965 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.732452 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.735306 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.770663 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1"} Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.773376 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841262 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841347 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 
13:48:34.841470 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841715 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.842086 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.842162 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.853006 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.856148 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.866847 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.886068 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.888456 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.905078 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.917970 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.922609 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.940627 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944126 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944194 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944235 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod 
\"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944262 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944323 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944385 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.948803 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 
13:48:34.960388 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.974040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.974684 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.988858 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.619131917 podStartE2EDuration="4.988841538s" podCreationTimestamp="2026-02-17 13:48:30 +0000 UTC" firstStartedPulling="2026-02-17 13:48:30.854255724 +0000 UTC m=+1384.965675101" lastFinishedPulling="2026-02-17 13:48:34.223965385 +0000 UTC m=+1388.335384722" observedRunningTime="2026-02-17 13:48:34.852558885 +0000 UTC m=+1388.963978222" watchObservedRunningTime="2026-02-17 13:48:34.988841538 +0000 UTC m=+1389.100260875" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.056750 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.057559 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 
13:48:35.057685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.057708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.057921 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.063250 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.066150 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.066825 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.067819 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.080716 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.086566 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.104261 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.113660 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.122530 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.134349 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.146309 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.159977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.160380 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.160477 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.160548 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.161001 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.163402 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.168293 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.198180 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262427 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262502 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262550 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262581 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262673 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262718 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262772 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262823 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262883 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263453 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263485 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263521 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263586 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.267964 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.274744 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.275174 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.277431 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.289778 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366535 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366597 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366679 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366762 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366804 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366840 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod 
\"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366994 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.368813 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.369181 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.369745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.370112 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " 
pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.370358 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.378183 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.386830 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.393024 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.388288 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.440688 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.474674 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.532525 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.720714 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.741303 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.830559 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.836536 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerStarted","Data":"b65e726ad0fd58fbc98c718204a9a6619e848272b9dfc249b9a1897ff310c04a"} Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.839502 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerStarted","Data":"76fb016395e0231c3d8a7ae1865fe7cb74985c931d1b06a2d2e7c2491c7f5dbe"} Feb 17 13:48:35 crc kubenswrapper[4804]: W0217 13:48:35.849542 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd12412cb_bde4_4c84_bd52_42ac9cb6232c.slice/crio-4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4 WatchSource:0}: Error finding container 4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4: Status 404 returned error can't 
find the container with id 4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4 Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.258072 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.339885 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.363349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:36 crc kubenswrapper[4804]: W0217 13:48:36.371325 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80e4a011_e72b_4fea_b6cb_15425d5d5940.slice/crio-f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06 WatchSource:0}: Error finding container f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06: Status 404 returned error can't find the container with id f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06 Feb 17 13:48:36 crc kubenswrapper[4804]: W0217 13:48:36.384623 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeb819ef_7656_4054_baa2_02efb705872d.slice/crio-4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199 WatchSource:0}: Error finding container 4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199: Status 404 returned error can't find the container with id 4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199 Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.412987 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.414228 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.417120 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.417397 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.438734 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498679 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498811 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498864 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498914 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.599999 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.600057 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.600098 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.600158 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.606938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.636147 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.640255 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.645693 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.873877 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerStarted","Data":"29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.879494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerStarted","Data":"f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.880987 4804 generic.go:334] "Generic (PLEG): container finished" podID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" exitCode=0 Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.881075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerDied","Data":"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.881097 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerStarted","Data":"9ac5534fef55ed02d86af4d8912cb72f23f77c2e384ce39f866abb0e39f803e5"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.882402 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.886611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerStarted","Data":"4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.890340 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerStarted","Data":"4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.902087 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pmp8r" podStartSLOduration=2.902068878 podStartE2EDuration="2.902068878s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:36.895499222 +0000 UTC m=+1391.006918559" watchObservedRunningTime="2026-02-17 13:48:36.902068878 +0000 UTC m=+1391.013488215" Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.497873 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.913830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerStarted","Data":"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2"} Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.914543 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.927913 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerStarted","Data":"e2acfd4d07f2a376a865d19f0462c02776c9702874f6443998a2d4c2b54946eb"} Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.946665 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" podStartSLOduration=3.946644933 podStartE2EDuration="3.946644933s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:37.939693004 +0000 UTC m=+1392.051112341" watchObservedRunningTime="2026-02-17 13:48:37.946644933 +0000 UTC m=+1392.058064270" Feb 17 13:48:38 crc kubenswrapper[4804]: I0217 13:48:38.827377 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:38 crc kubenswrapper[4804]: I0217 13:48:38.845131 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:40 crc kubenswrapper[4804]: I0217 13:48:40.972324 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerStarted","Data":"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87"} Feb 17 13:48:40 crc kubenswrapper[4804]: I0217 13:48:40.972785 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" gracePeriod=30 Feb 17 13:48:40 crc kubenswrapper[4804]: I0217 13:48:40.994727 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=1.7469515329999998 podStartE2EDuration="5.994704775s" podCreationTimestamp="2026-02-17 13:48:35 +0000 UTC" firstStartedPulling="2026-02-17 13:48:36.387905206 +0000 UTC m=+1390.499324553" lastFinishedPulling="2026-02-17 13:48:40.635658458 +0000 UTC m=+1394.747077795" observedRunningTime="2026-02-17 13:48:40.991157893 +0000 UTC m=+1395.102577240" watchObservedRunningTime="2026-02-17 13:48:40.994704775 +0000 UTC m=+1395.106124112" Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.986945 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerStarted","Data":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"} Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.987323 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerStarted","Data":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"} Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.987047 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log" containerID="cri-o://4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" gracePeriod=30 Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.987353 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata" containerID="cri-o://68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" gracePeriod=30 Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.996004 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerStarted","Data":"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01"} Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.996050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerStarted","Data":"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f"} Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.001015 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerStarted","Data":"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"} Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.005575 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerStarted","Data":"24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137"} Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.017915 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.283978363 podStartE2EDuration="8.017896947s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="2026-02-17 13:48:35.873859688 +0000 UTC m=+1389.985279025" lastFinishedPulling="2026-02-17 13:48:40.607778272 +0000 UTC m=+1394.719197609" observedRunningTime="2026-02-17 13:48:42.012963592 +0000 UTC m=+1396.124382929" watchObservedRunningTime="2026-02-17 13:48:42.017896947 +0000 UTC m=+1396.129316284" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.038555 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" podStartSLOduration=6.038535136 podStartE2EDuration="6.038535136s" podCreationTimestamp="2026-02-17 13:48:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:42.027766317 +0000 UTC m=+1396.139185654" watchObservedRunningTime="2026-02-17 13:48:42.038535136 +0000 UTC m=+1396.149954493" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.056015 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.827953354 podStartE2EDuration="8.055997595s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="2026-02-17 13:48:36.378587524 +0000 UTC m=+1390.490006861" lastFinishedPulling="2026-02-17 13:48:40.606631765 +0000 UTC m=+1394.718051102" observedRunningTime="2026-02-17 13:48:42.050718149 +0000 UTC m=+1396.162137496" watchObservedRunningTime="2026-02-17 13:48:42.055997595 +0000 UTC m=+1396.167416932" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.071632 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.236333265 podStartE2EDuration="8.071610295s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="2026-02-17 13:48:35.771357196 +0000 UTC m=+1389.882776533" lastFinishedPulling="2026-02-17 13:48:40.606634226 +0000 UTC m=+1394.718053563" observedRunningTime="2026-02-17 13:48:42.062715176 +0000 UTC m=+1396.174134503" watchObservedRunningTime="2026-02-17 13:48:42.071610295 +0000 UTC m=+1396.183029632" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.560968 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.672667 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.672969 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.672999 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.673018 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.676743 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs" (OuterVolumeSpecName: "logs") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.679416 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j" (OuterVolumeSpecName: "kube-api-access-5pk2j") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "kube-api-access-5pk2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.704529 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.708623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data" (OuterVolumeSpecName: "config-data") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.774869 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.775184 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.775218 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.775232 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.014545 4804 generic.go:334] "Generic (PLEG): container finished" podID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" exitCode=0
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.014575 4804 generic.go:334] "Generic (PLEG): container finished" podID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" exitCode=143
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.015425 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018017 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerDied","Data":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"}
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018052 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerDied","Data":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"}
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018063 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerDied","Data":"4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4"}
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018077 4804 scope.go:117] "RemoveContainer" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.067919 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.074099 4804 scope.go:117] "RemoveContainer" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.092891 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112063 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.112457 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112473 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata"
Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.112519 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112525 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112726 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112752 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.114055 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.126693 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.129888 4804 scope.go:117] "RemoveContainer" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"
Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.138418 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": container with ID starting with 68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d not found: ID does not exist" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.138479 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"} err="failed to get container status \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": rpc error: code = NotFound desc = could not find container \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": container with ID starting with 68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d not found: ID does not exist"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.138510 4804 scope.go:117] "RemoveContainer" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"
Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.139880 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": container with ID starting with 4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c not found: ID does not exist" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.139921 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"} err="failed to get container status \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": rpc error: code = NotFound desc = could not find container \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": container with ID starting with 4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c not found: ID does not exist"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.139951 4804 scope.go:117] "RemoveContainer" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.140187 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"} err="failed to get container status \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": rpc error: code = NotFound desc = could not find container \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": container with ID starting with 68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d not found: ID does not exist"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.140226 4804 scope.go:117] "RemoveContainer" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.140423 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"} err="failed to get container status \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": rpc error: code = NotFound desc = could not find container \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": container with ID starting with 4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c not found: ID does not exist"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.154860 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.182438 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213224 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213248 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213270 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213357 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.315737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316029 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316103 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316246 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.317156 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.320613 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.321528 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.334090 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.334256 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.477980 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 13:48:44 crc kubenswrapper[4804]: I0217 13:48:44.586662 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" path="/var/lib/kubelet/pods/d12412cb-bde4-4c84-bd52-42ac9cb6232c/volumes"
Feb 17 13:48:44 crc kubenswrapper[4804]: I0217 13:48:44.672620 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.081684 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.082031 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.128238 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.266854 4804 generic.go:334] "Generic (PLEG): container finished" podID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerID="29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6" exitCode=0
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.266960 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerDied","Data":"29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6"}
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.271109 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerStarted","Data":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"}
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.271289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerStarted","Data":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"}
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.271370 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerStarted","Data":"981aac18a07ba416bc920d67be4b035029798cd9a4dd9ffa83de28fefc1ded2e"}
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.310676 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.31065631 podStartE2EDuration="2.31065631s" podCreationTimestamp="2026-02-17 13:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:45.307529961 +0000 UTC m=+1399.418949318" watchObservedRunningTime="2026-02-17 13:48:45.31065631 +0000 UTC m=+1399.422075647"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.313175 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.442401 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.442460 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.477257 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-nm74r"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.539349 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.547343 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"]
Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.547823 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns" containerID="cri-o://be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7" gracePeriod=10
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.097566 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277445 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277591 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277640 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277667 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277751 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277781 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285288 4804 generic.go:334] "Generic (PLEG): container finished" podID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7" exitCode=0
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285404 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285416 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerDied","Data":"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"}
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285496 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerDied","Data":"2d996d992d2a3254b879bd96b12e636e65525644b7181f7f3f61897c257c69b0"}
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285521 4804 scope.go:117] "RemoveContainer" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.297556 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf" (OuterVolumeSpecName: "kube-api-access-x9ztf") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "kube-api-access-x9ztf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.355938 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.360637 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.365675 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config" (OuterVolumeSpecName: "config") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.379829 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382283 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382318 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382332 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382346 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382358 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.410775 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.484390 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.488442 4804 scope.go:117] "RemoveContainer" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.507702 4804 scope.go:117] "RemoveContainer" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"
Feb 17 13:48:46 crc kubenswrapper[4804]: E0217 13:48:46.512171 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7\": container with ID starting with be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7 not found: ID does not exist" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.512245 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"} err="failed to get container status \"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7\": rpc error: code = NotFound desc = could not find container \"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7\": container with ID starting with be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7 not found: ID does not exist"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.512276 4804 scope.go:117] "RemoveContainer" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165"
Feb 17 13:48:46 crc kubenswrapper[4804]: E0217 13:48:46.515495 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165\": container with ID starting with 694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165 not found: ID does not exist" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.515529 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165"} err="failed to get container status \"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165\": rpc error: code = NotFound desc = could not find container \"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165\": container with ID starting with 694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165 not found: ID does not exist"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.524436 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.524436 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.621300 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r"
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.627051 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"]
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.636655 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"]
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688167 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688244 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688273 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688361 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") "
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.692152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb" (OuterVolumeSpecName: "kube-api-access-j8hvb") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "kube-api-access-j8hvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.692745 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts" (OuterVolumeSpecName: "scripts") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.714759 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data" (OuterVolumeSpecName: "config-data") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.715836 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790074 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790354 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790439 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790514 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.299524 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerDied","Data":"b65e726ad0fd58fbc98c718204a9a6619e848272b9dfc249b9a1897ff310c04a"}
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.300856 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b65e726ad0fd58fbc98c718204a9a6619e848272b9dfc249b9a1897ff310c04a"
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.299604 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r"
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.405038 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.405313 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log" containerID="cri-o://23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" gracePeriod=30
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.405764 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api" containerID="cri-o://43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" gracePeriod=30
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.423930 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.424415 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler" containerID="cri-o://def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f" gracePeriod=30
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.441109 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.442320 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log" containerID="cri-o://316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" gracePeriod=30
Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.442360 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata" containerID="cri-o://1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" gracePeriod=30
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.062080 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213284 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") "
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213376 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") "
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213505 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") "
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") "
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213573 4804 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213737 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs" (OuterVolumeSpecName: "logs") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.214413 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.220478 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh" (OuterVolumeSpecName: "kube-api-access-fgqgh") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "kube-api-access-fgqgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.241270 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.250601 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data" (OuterVolumeSpecName: "config-data") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.278631 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317056 4804 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317108 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317127 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317139 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331187 4804 generic.go:334] "Generic (PLEG): container finished" podID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" exitCode=0 Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331236 4804 generic.go:334] "Generic (PLEG): container finished" podID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" exitCode=143 Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331274 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331287 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerDied","Data":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331375 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerDied","Data":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331392 4804 scope.go:117] "RemoveContainer" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331397 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerDied","Data":"981aac18a07ba416bc920d67be4b035029798cd9a4dd9ffa83de28fefc1ded2e"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.333455 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" exitCode=143 Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.333496 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerDied","Data":"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.376905 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.378353 4804 scope.go:117] "RemoveContainer" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.392469 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411031 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411655 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411693 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns" Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411708 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411717 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata" Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411740 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="init" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411746 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="init" Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411776 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411783 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log" Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411806 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerName="nova-manage" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411813 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerName="nova-manage" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412061 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412096 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412112 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412123 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerName="nova-manage" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.413650 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.417056 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.419242 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.419383 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.425169 4804 scope.go:117] "RemoveContainer" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.426756 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": container with ID starting with 1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c not found: ID does not exist" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.426788 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"} err="failed to get container status \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": rpc error: code = NotFound desc = could not find container \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": container with ID starting with 1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c not found: ID does not exist" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.426814 4804 scope.go:117] "RemoveContainer" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" Feb 17 13:48:48 crc 
kubenswrapper[4804]: E0217 13:48:48.430885 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": container with ID starting with 316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d not found: ID does not exist" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.431088 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"} err="failed to get container status \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": rpc error: code = NotFound desc = could not find container \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": container with ID starting with 316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d not found: ID does not exist" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.431119 4804 scope.go:117] "RemoveContainer" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.431982 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"} err="failed to get container status \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": rpc error: code = NotFound desc = could not find container \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": container with ID starting with 1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c not found: ID does not exist" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.432016 4804 scope.go:117] "RemoveContainer" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" Feb 17 
13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.432374 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"} err="failed to get container status \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": rpc error: code = NotFound desc = could not find container \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": container with ID starting with 316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d not found: ID does not exist" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520487 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520565 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520620 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520704 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520773 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.586408 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" path="/var/lib/kubelet/pods/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21/volumes" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.587109 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" path="/var/lib/kubelet/pods/737ac1d8-ad22-4a56-b203-eb2212949fb6/volumes" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623269 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623392 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623491 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623595 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623895 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.626854 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.626956 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.627218 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.640920 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.735776 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:49 crc kubenswrapper[4804]: I0217 13:48:49.192742 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:49 crc kubenswrapper[4804]: W0217 13:48:49.194259 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa87191a_671d_43c8_b8c2_e5e07a54af02.slice/crio-7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a WatchSource:0}: Error finding container 7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a: Status 404 returned error can't find the container with id 7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a Feb 17 13:48:49 crc kubenswrapper[4804]: I0217 13:48:49.343118 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerStarted","Data":"7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a"} Feb 17 13:48:49 crc kubenswrapper[4804]: I0217 13:48:49.934338 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.050312 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"43796f1c-9838-40a1-9829-f878c2a7f076\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.050697 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod \"43796f1c-9838-40a1-9829-f878c2a7f076\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.050808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"43796f1c-9838-40a1-9829-f878c2a7f076\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.056804 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx" (OuterVolumeSpecName: "kube-api-access-hnbnx") pod "43796f1c-9838-40a1-9829-f878c2a7f076" (UID: "43796f1c-9838-40a1-9829-f878c2a7f076"). InnerVolumeSpecName "kube-api-access-hnbnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.100331 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43796f1c-9838-40a1-9829-f878c2a7f076" (UID: "43796f1c-9838-40a1-9829-f878c2a7f076"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.100373 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data" (OuterVolumeSpecName: "config-data") pod "43796f1c-9838-40a1-9829-f878c2a7f076" (UID: "43796f1c-9838-40a1-9829-f878c2a7f076"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.152851 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.152900 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.152913 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.370893 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerStarted","Data":"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5"} Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.370963 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerStarted","Data":"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679"} Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374805 4804 generic.go:334] "Generic (PLEG): 
container finished" podID="43796f1c-9838-40a1-9829-f878c2a7f076" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f" exitCode=0 Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374852 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerDied","Data":"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"} Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374899 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerDied","Data":"76fb016395e0231c3d8a7ae1865fe7cb74985c931d1b06a2d2e7c2491c7f5dbe"} Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374917 4804 scope.go:117] "RemoveContainer" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374922 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.378934 4804 generic.go:334] "Generic (PLEG): container finished" podID="c11e165e-2605-470a-a865-230b274ce8d3" containerID="24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137" exitCode=0 Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.379125 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerDied","Data":"24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137"} Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.399991 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.399973515 podStartE2EDuration="2.399973515s" podCreationTimestamp="2026-02-17 13:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:50.389470075 +0000 UTC m=+1404.500889432" watchObservedRunningTime="2026-02-17 13:48:50.399973515 +0000 UTC m=+1404.511392852" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.400141 4804 scope.go:117] "RemoveContainer" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f" Feb 17 13:48:50 crc kubenswrapper[4804]: E0217 13:48:50.400522 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f\": container with ID starting with def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f not found: ID does not exist" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.400548 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"} err="failed to get container status \"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f\": rpc error: code = NotFound desc = could not find container \"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f\": container with ID starting with def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f not found: ID does not exist" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.442394 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.452493 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.467691 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:50 crc kubenswrapper[4804]: E0217 13:48:50.468038 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.468058 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.468279 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.468929 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.472214 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.481369 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.586461 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" path="/var/lib/kubelet/pods/43796f1c-9838-40a1-9829-f878c2a7f076/volumes" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.661509 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.661819 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.662167 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.764135 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f65hb\" (UniqueName: 
\"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.764984 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.766520 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.769070 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.769231 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.783941 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " 
pod="openstack/nova-scheduler-0" Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.793060 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.244443 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.389893 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerStarted","Data":"45d059e86e213177abef9a85b8685f82c73749ae5fde7098a9e718ebf9c0ae93"} Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.729060 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835659 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835702 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835769 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835847 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.839967 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4" (OuterVolumeSpecName: "kube-api-access-smxf4") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "kube-api-access-smxf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.840260 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts" (OuterVolumeSpecName: "scripts") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.883421 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.883800 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data" (OuterVolumeSpecName: "config-data") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937743 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937782 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937798 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937811 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.378179 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.402157 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerDied","Data":"e2acfd4d07f2a376a865d19f0462c02776c9702874f6443998a2d4c2b54946eb"} Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.402190 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.402219 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2acfd4d07f2a376a865d19f0462c02776c9702874f6443998a2d4c2b54946eb" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.404176 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerStarted","Data":"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a"} Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407734 4804 generic.go:334] "Generic (PLEG): container finished" podID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" exitCode=0 Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407782 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerDied","Data":"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01"} Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407814 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerDied","Data":"f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06"} Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407811 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407830 4804 scope.go:117] "RemoveContainer" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.440678 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.440654061 podStartE2EDuration="2.440654061s" podCreationTimestamp="2026-02-17 13:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:52.430697808 +0000 UTC m=+1406.542117155" watchObservedRunningTime="2026-02-17 13:48:52.440654061 +0000 UTC m=+1406.552073398" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.449281 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.450831 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs" (OuterVolumeSpecName: "logs") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.452050 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.452211 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.452345 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.453792 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.460446 4804 scope.go:117] "RemoveContainer" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.462073 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq" (OuterVolumeSpecName: "kube-api-access-gptwq") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "kube-api-access-gptwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.509464 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.514059 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data" (OuterVolumeSpecName: "config-data") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527335 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.527872 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527897 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api" Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.527913 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527920 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log" Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.527929 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c11e165e-2605-470a-a865-230b274ce8d3" containerName="nova-cell1-conductor-db-sync" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527936 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11e165e-2605-470a-a865-230b274ce8d3" containerName="nova-cell1-conductor-db-sync" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.528237 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.528260 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.528272 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11e165e-2605-470a-a865-230b274ce8d3" containerName="nova-cell1-conductor-db-sync" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.529026 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.532470 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.536947 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.556768 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.556821 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.556831 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587022 4804 scope.go:117] "RemoveContainer" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.587439 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01\": container with ID starting with 43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01 not found: ID does not exist" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587488 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01"} err="failed to get container status \"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01\": rpc error: code = NotFound desc = could not find container \"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01\": container with ID starting with 43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01 not found: ID does not exist" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587517 4804 scope.go:117] "RemoveContainer" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.587847 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f\": container with ID starting with 23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f not found: ID does not exist" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587915 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f"} err="failed to get container status \"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f\": rpc error: code = NotFound desc = could not find container \"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f\": container with ID starting with 23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f not found: ID does not exist" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.658940 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzvp\" (UniqueName: \"kubernetes.io/projected/a13dbc73-75fc-448b-af44-cb7018d1640e-kube-api-access-nbzvp\") pod 
\"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.660043 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.661733 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.744763 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.753566 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.766992 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.768737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.768837 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" 
(UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.768894 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzvp\" (UniqueName: \"kubernetes.io/projected/a13dbc73-75fc-448b-af44-cb7018d1640e-kube-api-access-nbzvp\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.769111 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.772959 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.774057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.774898 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.778866 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.789443 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzvp\" (UniqueName: \"kubernetes.io/projected/a13dbc73-75fc-448b-af44-cb7018d1640e-kube-api-access-nbzvp\") pod \"nova-cell1-conductor-0\" (UID: 
\"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870712 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870821 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.883050 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990168 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990597 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990624 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990888 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.991659 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.998724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.999010 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.021931 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.100147 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.333943 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 13:48:53 crc kubenswrapper[4804]: W0217 13:48:53.339595 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda13dbc73_75fc_448b_af44_cb7018d1640e.slice/crio-eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61 WatchSource:0}: Error finding container eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61: Status 404 returned error can't find the container with id eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61 Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.421461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"a13dbc73-75fc-448b-af44-cb7018d1640e","Type":"ContainerStarted","Data":"eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61"} Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.547173 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:53 crc kubenswrapper[4804]: W0217 13:48:53.551282 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d46aa4d_a4d9_4376_8c8f_2dee489f4662.slice/crio-d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667 WatchSource:0}: Error finding container d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667: Status 404 returned error can't find the container with id d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667 Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.737612 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.737679 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.433580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerStarted","Data":"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.433974 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerStarted","Data":"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.433987 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerStarted","Data":"d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.438456 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a13dbc73-75fc-448b-af44-cb7018d1640e","Type":"ContainerStarted","Data":"839843a1103d6f617dc26c0fc61f8789035e22903b26a0675932659651d3a249"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.438600 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.461417 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.4613965110000002 podStartE2EDuration="2.461396511s" podCreationTimestamp="2026-02-17 13:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:54.456085853 +0000 UTC m=+1408.567505200" watchObservedRunningTime="2026-02-17 13:48:54.461396511 +0000 UTC m=+1408.572815858" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.480815 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.48079674 podStartE2EDuration="2.48079674s" podCreationTimestamp="2026-02-17 13:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:54.474241654 +0000 UTC m=+1408.585660991" watchObservedRunningTime="2026-02-17 13:48:54.48079674 +0000 UTC m=+1408.592216077" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.587186 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" 
path="/var/lib/kubelet/pods/80e4a011-e72b-4fea-b6cb-15425d5d5940/volumes" Feb 17 13:48:55 crc kubenswrapper[4804]: I0217 13:48:55.793654 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 13:48:55 crc kubenswrapper[4804]: I0217 13:48:55.835468 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:48:55 crc kubenswrapper[4804]: I0217 13:48:55.835537 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:48:58 crc kubenswrapper[4804]: I0217 13:48:58.736398 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:48:58 crc kubenswrapper[4804]: I0217 13:48:58.736678 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:48:59 crc kubenswrapper[4804]: I0217 13:48:59.749384 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:48:59 crc kubenswrapper[4804]: I0217 13:48:59.749442 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:00 crc kubenswrapper[4804]: I0217 13:49:00.377907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 13:49:00 crc kubenswrapper[4804]: I0217 13:49:00.794471 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 13:49:00 crc kubenswrapper[4804]: I0217 13:49:00.827039 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 13:49:01 crc kubenswrapper[4804]: I0217 13:49:01.562080 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 13:49:02 crc kubenswrapper[4804]: I0217 13:49:02.918929 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.100994 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.101308 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.865867 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.866088 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" containerID="cri-o://5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" gracePeriod=30 Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.186812 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.187348 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.387267 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.506403 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"cae6d84c-f65f-4ab2-a733-424ea34c680d\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.512933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr" (OuterVolumeSpecName: "kube-api-access-ncwbr") pod "cae6d84c-f65f-4ab2-a733-424ea34c680d" (UID: "cae6d84c-f65f-4ab2-a733-424ea34c680d"). InnerVolumeSpecName "kube-api-access-ncwbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596110 4804 generic.go:334] "Generic (PLEG): container finished" podID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" exitCode=2 Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerDied","Data":"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0"} Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerDied","Data":"6af0a26e9132d4c61e6cb494719994825c6ff8368e85c8ef8c51fa4c2767ffd0"} Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596224 4804 scope.go:117] "RemoveContainer" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596342 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.608102 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.681512 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.696045 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.700081 4804 scope.go:117] "RemoveContainer" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" Feb 17 13:49:04 crc kubenswrapper[4804]: E0217 13:49:04.709363 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0\": container with ID starting with 5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0 not found: ID does not exist" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.709426 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0"} err="failed to get container status \"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0\": rpc error: code = NotFound desc = could not find container \"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0\": container with ID starting with 5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0 not found: ID does not exist" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.728272 4804 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: E0217 13:49:04.728764 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.728781 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.728993 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.729712 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.748178 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.748338 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.761599 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811327 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmz5\" (UniqueName: \"kubernetes.io/projected/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-api-access-jmmz5\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811441 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811487 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811516 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmz5\" (UniqueName: \"kubernetes.io/projected/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-api-access-jmmz5\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912782 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912836 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.918486 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.918588 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.921559 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.932999 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmz5\" (UniqueName: 
\"kubernetes.io/projected/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-api-access-jmmz5\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.091442 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.573848 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:05 crc kubenswrapper[4804]: W0217 13:49:05.577167 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6aabf20_b0bf_4f35_aec7_098f38bacfd9.slice/crio-8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358 WatchSource:0}: Error finding container 8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358: Status 404 returned error can't find the container with id 8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.605549 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d6aabf20-b0bf-4f35-aec7-098f38bacfd9","Type":"ContainerStarted","Data":"8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358"} Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.839995 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840647 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" containerID="cri-o://7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" gracePeriod=30 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840780 4804 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" containerID="cri-o://739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" gracePeriod=30 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840869 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-notification-agent" containerID="cri-o://c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" gracePeriod=30 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840851 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" containerID="cri-o://626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" gracePeriod=30 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.585796 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" path="/var/lib/kubelet/pods/cae6d84c-f65f-4ab2-a733-424ea34c680d/volumes" Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.617733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d6aabf20-b0bf-4f35-aec7-098f38bacfd9","Type":"ContainerStarted","Data":"58b9835c32417f29c65635ceb7ce6b84e66c392dc8d4953534bc30e129934091"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.618619 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621160 4804 generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" exitCode=0 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621191 4804 
generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" exitCode=2 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621221 4804 generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" exitCode=0 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621247 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621271 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621285 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.643898 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.237444754 podStartE2EDuration="2.643868619s" podCreationTimestamp="2026-02-17 13:49:04 +0000 UTC" firstStartedPulling="2026-02-17 13:49:05.579246504 +0000 UTC m=+1419.690665841" lastFinishedPulling="2026-02-17 13:49:05.985670369 +0000 UTC m=+1420.097089706" observedRunningTime="2026-02-17 13:49:06.633765592 +0000 UTC m=+1420.745184929" watchObservedRunningTime="2026-02-17 13:49:06.643868619 +0000 UTC m=+1420.755287956" Feb 17 13:49:08 crc kubenswrapper[4804]: I0217 
13:49:08.742392 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:08 crc kubenswrapper[4804]: I0217 13:49:08.747055 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:08 crc kubenswrapper[4804]: I0217 13:49:08.752822 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:09 crc kubenswrapper[4804]: I0217 13:49:09.665720 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.094005 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206823 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206880 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206948 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206988 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.207055 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.207095 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.207189 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.208681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.208756 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.214681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts" (OuterVolumeSpecName: "scripts") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.234443 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.238371 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6" (OuterVolumeSpecName: "kube-api-access-5l9t6") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "kube-api-access-5l9t6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310361 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310388 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310398 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310444 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310452 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.312497 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.327799 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data" (OuterVolumeSpecName: "config-data") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.412259 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.412290 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668811 4804 generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" exitCode=0 Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668865 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607"} Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668916 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8"} Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668919 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668933 4804 scope.go:117] "RemoveContainer" containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.691165 4804 scope.go:117] "RemoveContainer" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.694179 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.705240 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.713006 4804 scope.go:117] "RemoveContainer" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723125 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723578 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723598 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723610 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723616 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723636 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" 
containerName="ceilometer-notification-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723642 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-notification-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723653 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723658 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723837 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723854 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723870 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-notification-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723881 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.725550 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.732281 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.732555 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.732869 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.734878 4804 scope.go:117] "RemoveContainer" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.746339 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822358 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822586 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " 
pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822622 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822683 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822997 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.856368 4804 scope.go:117] "RemoveContainer" 
containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.857286 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1\": container with ID starting with 739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1 not found: ID does not exist" containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857348 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1"} err="failed to get container status \"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1\": rpc error: code = NotFound desc = could not find container \"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1\": container with ID starting with 739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857379 4804 scope.go:117] "RemoveContainer" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.857763 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10\": container with ID starting with 626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10 not found: ID does not exist" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857787 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10"} err="failed to get container status \"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10\": rpc error: code = NotFound desc = could not find container \"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10\": container with ID starting with 626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857804 4804 scope.go:117] "RemoveContainer" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.858063 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607\": container with ID starting with c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607 not found: ID does not exist" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.858088 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607"} err="failed to get container status \"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607\": rpc error: code = NotFound desc = could not find container \"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607\": container with ID starting with c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.858105 4804 scope.go:117] "RemoveContainer" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.858320 4804 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37\": container with ID starting with 7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37 not found: ID does not exist" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.858346 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37"} err="failed to get container status \"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37\": rpc error: code = NotFound desc = could not find container \"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37\": container with ID starting with 7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925160 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925243 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925321 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " 
pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925345 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925421 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925454 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925473 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925508 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.926483 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.926672 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.932118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.932271 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.932997 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.937389 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.938184 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.943156 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.154785 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.317797 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.435423 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"aeb819ef-7656-4054-baa2-02efb705872d\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.435525 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"aeb819ef-7656-4054-baa2-02efb705872d\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.435710 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"aeb819ef-7656-4054-baa2-02efb705872d\" (UID: 
\"aeb819ef-7656-4054-baa2-02efb705872d\") " Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.440191 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q" (OuterVolumeSpecName: "kube-api-access-b8n6q") pod "aeb819ef-7656-4054-baa2-02efb705872d" (UID: "aeb819ef-7656-4054-baa2-02efb705872d"). InnerVolumeSpecName "kube-api-access-b8n6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.465464 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data" (OuterVolumeSpecName: "config-data") pod "aeb819ef-7656-4054-baa2-02efb705872d" (UID: "aeb819ef-7656-4054-baa2-02efb705872d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.465742 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeb819ef-7656-4054-baa2-02efb705872d" (UID: "aeb819ef-7656-4054-baa2-02efb705872d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.538267 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.538311 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.538323 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:11 crc kubenswrapper[4804]: W0217 13:49:11.662726 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8813e275_f23d_497a_a085_0ac6e26ab8c0.slice/crio-4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024 WatchSource:0}: Error finding container 4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024: Status 404 returned error can't find the container with id 4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024 Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.664664 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.678543 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024"} Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680102 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="aeb819ef-7656-4054-baa2-02efb705872d" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" exitCode=137 Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680144 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerDied","Data":"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87"} Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680160 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerDied","Data":"4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199"} Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680175 4804 scope.go:117] "RemoveContainer" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680270 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.708295 4804 scope.go:117] "RemoveContainer" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" Feb 17 13:49:11 crc kubenswrapper[4804]: E0217 13:49:11.709546 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87\": container with ID starting with 12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87 not found: ID does not exist" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.709585 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87"} err="failed to get container status \"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87\": rpc error: code = NotFound desc = could not find container \"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87\": container with ID starting with 12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87 not found: ID does not exist" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.710421 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.718603 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.731714 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: E0217 13:49:11.732163 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" 
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.732183 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.732416 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.733089 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.737325 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.737431 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.737757 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.744277 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.842819 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6xsm\" (UniqueName: \"kubernetes.io/projected/5c380610-c164-4798-a5df-9b90fd475667-kube-api-access-m6xsm\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.842965 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-vencrypt-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.843154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.843312 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.843475 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.944767 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.944954 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-config-data\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.945001 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xsm\" (UniqueName: \"kubernetes.io/projected/5c380610-c164-4798-a5df-9b90fd475667-kube-api-access-m6xsm\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.945044 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.945097 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.950784 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.950985 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.951346 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.956982 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.963824 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6xsm\" (UniqueName: \"kubernetes.io/projected/5c380610-c164-4798-a5df-9b90fd475667-kube-api-access-m6xsm\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.057992 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.512016 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.592574 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" path="/var/lib/kubelet/pods/0e6284b7-c2bf-491d-a8b8-66390efc3657/volumes"
Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.593589 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb819ef-7656-4054-baa2-02efb705872d" path="/var/lib/kubelet/pods/aeb819ef-7656-4054-baa2-02efb705872d/volumes"
Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.691277 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c380610-c164-4798-a5df-9b90fd475667","Type":"ContainerStarted","Data":"d69c90225ba282cd52c0e4053112dc70f43cb765cbf91891dfd4fd705ac37225"}
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.104264 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.105706 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.108921 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.113911 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.702156 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c380610-c164-4798-a5df-9b90fd475667","Type":"ContainerStarted","Data":"3e3b56f303907257935d1c0c65df81e464e3beecf73066a6cd9b9dee8ec04501"}
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.706007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"}
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.706076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.712899 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.724005 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.723980262 podStartE2EDuration="2.723980262s" podCreationTimestamp="2026-02-17 13:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:13.721143893 +0000 UTC m=+1427.832563230" watchObservedRunningTime="2026-02-17 13:49:13.723980262 +0000 UTC m=+1427.835399599"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.904703 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"]
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.906375 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.938400 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"]
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092472 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092766 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092906 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092999 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.093809 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.093936 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.195407 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.195699 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.196238 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.196603 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.197023 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.197506 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.198329 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.199365 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.199510 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.199654 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.200455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.227536 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.232619 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.789120 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"]
Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.103111 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.724916 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" exitCode=0
Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.724986 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerDied","Data":"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba"}
Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.725021 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerStarted","Data":"a2838e3552cf9ee264c86b1e5acbfc8482d43bbc95f8a3776ff5253f31fed64a"}
Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.450304 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734071 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerStarted","Data":"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b"}
Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734245 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"
Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734374 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log" containerID="cri-o://e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" gracePeriod=30
Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734433 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" containerID="cri-o://1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" gracePeriod=30
Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.759319 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" podStartSLOduration=3.759302102 podStartE2EDuration="3.759302102s" podCreationTimestamp="2026-02-17 13:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:16.758420504 +0000 UTC m=+1430.869839851" watchObservedRunningTime="2026-02-17 13:49:16.759302102 +0000 UTC m=+1430.870721439"
Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.058240 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.364768 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.748871 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"}
Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.750365 4804 generic.go:334] "Generic (PLEG): container finished" podID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" exitCode=143
Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.750534 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerDied","Data":"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd"}
Feb 17 13:49:18 crc kubenswrapper[4804]: I0217 13:49:18.765331 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"}
Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.782289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"}
Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.783356 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" containerID="cri-o://cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" gracePeriod=30
Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.783728 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.784053 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd" containerID="cri-o://c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" gracePeriod=30
Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.784283 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent" containerID="cri-o://867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" gracePeriod=30
Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.784362 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core" containerID="cri-o://b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" gracePeriod=30
Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.815227 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.45422059 podStartE2EDuration="9.8152111s" podCreationTimestamp="2026-02-17 13:49:10 +0000 UTC" firstStartedPulling="2026-02-17 13:49:11.664670381 +0000 UTC m=+1425.776089718" lastFinishedPulling="2026-02-17 13:49:19.025660891 +0000 UTC m=+1433.137080228" observedRunningTime="2026-02-17 13:49:19.814067874 +0000 UTC m=+1433.925487211" watchObservedRunningTime="2026-02-17 13:49:19.8152111 +0000 UTC m=+1433.926630437"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.321191 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413705 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413762 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413815 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413895 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.414746 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs" (OuterVolumeSpecName: "logs") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.434246 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw" (OuterVolumeSpecName: "kube-api-access-nmcgw") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "kube-api-access-nmcgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.443486 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data" (OuterVolumeSpecName: "config-data") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.445069 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.489272 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516401 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516440 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516454 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516467 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617607 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617679 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617724 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617799 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617863 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618329 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618673 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618752 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618781 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") "
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618974 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.619609 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.619625 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.632070 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts" (OuterVolumeSpecName: "scripts") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.643008 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6" (OuterVolumeSpecName: "kube-api-access-24pj6") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "kube-api-access-24pj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.656856 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.680789 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.701874 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721111 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721149 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721164 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721178 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721187 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.725163 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data" (OuterVolumeSpecName: "config-data") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793916 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" exitCode=0
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793951 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" exitCode=2
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793959 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" exitCode=0
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793969 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" exitCode=0
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793980 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794034 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"}
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794092 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"}
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794112 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"}
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794127 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"}
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794140 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024"}
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794160 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.796441 4804 generic.go:334] "Generic (PLEG): container finished" podID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" exitCode=0
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.796479 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerDied","Data":"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0"}
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.796509 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerDied","Data":"d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667"}
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.796574 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.821226 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.822400 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.829780 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.842374 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.850940 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.855233 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886041 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886545 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886569 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api"
Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886584 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886593 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core"
Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886606 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886615 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent"
Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886625 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886633 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd"
Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886670 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log"
Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886677 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log"
Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886691 4804
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886698 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886908 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886925 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886947 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886966 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886981 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886994 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.888188 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.890040 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.890342 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.891016 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.907419 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.913835 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.923807 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.932489 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.934946 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.935103 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.936950 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.936984 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.937006 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.937898 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.937927 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.939018 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939053 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939076 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.939345 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939375 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID 
starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939391 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939419 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.939707 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939798 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939826 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940127 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status 
\"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940155 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940439 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940463 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940785 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940809 4804 scope.go:117] "RemoveContainer" 
containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.943645 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.949874 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.949930 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950274 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950316 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950609 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950637 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950881 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950914 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951125 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not 
exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951150 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951443 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951501 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951749 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951773 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951941 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status 
\"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951964 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.952143 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.952162 4804 scope.go:117] "RemoveContainer" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.975081 4804 scope.go:117] "RemoveContainer" containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.990988 4804 scope.go:117] "RemoveContainer" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.991567 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0\": container with ID starting with 1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0 not 
found: ID does not exist" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.991632 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0"} err="failed to get container status \"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0\": rpc error: code = NotFound desc = could not find container \"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0\": container with ID starting with 1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.991660 4804 scope.go:117] "RemoveContainer" containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.992041 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd\": container with ID starting with e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd not found: ID does not exist" containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.992073 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd"} err="failed to get container status \"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd\": rpc error: code = NotFound desc = could not find container \"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd\": container with ID starting with e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd not found: ID does not exist" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027295 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027350 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-log-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027382 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-run-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-config-data\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7mzz\" 
(UniqueName: \"kubernetes.io/projected/39bfc426-b9af-40b4-a713-26bb2366db7a-kube-api-access-l7mzz\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027603 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027665 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-scripts\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027719 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027763 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027930 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"nova-api-0\" (UID: 
\"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027987 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.028052 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.028066 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129795 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129867 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129901 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129932 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129983 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130008 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-log-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130034 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-run-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130064 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130157 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-config-data\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130187 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7mzz\" (UniqueName: \"kubernetes.io/projected/39bfc426-b9af-40b4-a713-26bb2366db7a-kube-api-access-l7mzz\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130255 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130290 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-scripts\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130343 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130566 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-run-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130658 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-log-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.131020 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.134225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-scripts\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.134583 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.134584 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-config-data\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.135969 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.136172 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.136616 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.144379 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.148162 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 
13:49:21.149314 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.152650 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7mzz\" (UniqueName: \"kubernetes.io/projected/39bfc426-b9af-40b4-a713-26bb2366db7a-kube-api-access-l7mzz\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.168776 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.212816 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.266844 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.758440 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.820492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerStarted","Data":"5ed23a96a0046c231b096d9ec7822cad0493113fc209be6092efb93ac3aeb1f1"} Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.835086 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.059167 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.089152 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.583567 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" path="/var/lib/kubelet/pods/2d46aa4d-a4d9-4376-8c8f-2dee489f4662/volumes" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.584702 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" path="/var/lib/kubelet/pods/8813e275-f23d-497a-a085-0ac6e26ab8c0/volumes" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.838064 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerStarted","Data":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.838100 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerStarted","Data":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.840041 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"9690be6c1002861bf0de390b3bd8f555e7e19daaea327060a786b5824e8f9b73"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.840078 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"e45c40f839a3f2cb44fd1aa6071f5e73e3e01da151a9e2b51aad903b7284b659"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.858256 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8582378029999997 podStartE2EDuration="2.858237803s" podCreationTimestamp="2026-02-17 13:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:22.85560921 +0000 UTC m=+1436.967028567" watchObservedRunningTime="2026-02-17 13:49:22.858237803 +0000 UTC m=+1436.969657140" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.860699 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.007443 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.009246 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.011442 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.011601 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.014286 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189433 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189605 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189650 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189684 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f5d8\" (UniqueName: 
\"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291828 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291917 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.295444 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.296928 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.297385 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.313980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.327406 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: W0217 13:49:23.757328 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b6d06cb_8252_4c27_815b_1f09a217cbb4.slice/crio-9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e WatchSource:0}: Error finding container 9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e: Status 404 returned error can't find the container with id 9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.763640 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.850394 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerStarted","Data":"9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e"} Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.853602 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"b6ecb0fb21c7e0514abe1a06cb934090aae05aa720515eef411bf85a5f5bd522"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.235311 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.309255 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.309558 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" 
containerID="cri-o://80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" gracePeriod=10 Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.831241 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.868171 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerStarted","Data":"3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.873053 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"142a6be816a205390c7d052cb7f3cc8b7e1d745dd99eb99bbf503bec3bc6c60f"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883105 4804 generic.go:334] "Generic (PLEG): container finished" podID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" exitCode=0 Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883147 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerDied","Data":"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883172 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerDied","Data":"9ac5534fef55ed02d86af4d8912cb72f23f77c2e384ce39f866abb0e39f803e5"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883187 4804 scope.go:117] "RemoveContainer" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" Feb 17 13:49:24 crc 
kubenswrapper[4804]: I0217 13:49:24.883686 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.890904 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-s8qtz" podStartSLOduration=2.890888857 podStartE2EDuration="2.890888857s" podCreationTimestamp="2026-02-17 13:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:24.885327012 +0000 UTC m=+1438.996746349" watchObservedRunningTime="2026-02-17 13:49:24.890888857 +0000 UTC m=+1439.002308194" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.911639 4804 scope.go:117] "RemoveContainer" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.940722 4804 scope.go:117] "RemoveContainer" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" Feb 17 13:49:24 crc kubenswrapper[4804]: E0217 13:49:24.942506 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2\": container with ID starting with 80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2 not found: ID does not exist" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.942545 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2"} err="failed to get container status \"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2\": rpc error: code = NotFound desc = could not find container 
\"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2\": container with ID starting with 80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2 not found: ID does not exist" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.942567 4804 scope.go:117] "RemoveContainer" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" Feb 17 13:49:24 crc kubenswrapper[4804]: E0217 13:49:24.942913 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654\": container with ID starting with 11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654 not found: ID does not exist" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.942951 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654"} err="failed to get container status \"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654\": rpc error: code = NotFound desc = could not find container \"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654\": container with ID starting with 11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654 not found: ID does not exist" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.028589 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029272 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029363 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029409 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029486 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029540 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.066211 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x" (OuterVolumeSpecName: "kube-api-access-dqn6x") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). 
InnerVolumeSpecName "kube-api-access-dqn6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.084011 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config" (OuterVolumeSpecName: "config") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.101093 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.110501 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.119254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.123903 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158521 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158567 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158582 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158592 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158603 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158612 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.218733 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.226976 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.835352 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.835742 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.586130 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" path="/var/lib/kubelet/pods/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70/volumes" Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.912409 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"bae793acfed60a7d791f396224f96d981c0c0d8afab0f42ff46c61d5cf3045c4"} Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.912808 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.935142 4804 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.783351529 podStartE2EDuration="6.935124804s" podCreationTimestamp="2026-02-17 13:49:20 +0000 UTC" firstStartedPulling="2026-02-17 13:49:21.831899462 +0000 UTC m=+1435.943318809" lastFinishedPulling="2026-02-17 13:49:25.983672747 +0000 UTC m=+1440.095092084" observedRunningTime="2026-02-17 13:49:26.933111782 +0000 UTC m=+1441.044531129" watchObservedRunningTime="2026-02-17 13:49:26.935124804 +0000 UTC m=+1441.046544141" Feb 17 13:49:28 crc kubenswrapper[4804]: I0217 13:49:28.937109 4804 generic.go:334] "Generic (PLEG): container finished" podID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerID="3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529" exitCode=0 Feb 17 13:49:28 crc kubenswrapper[4804]: I0217 13:49:28.937220 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerDied","Data":"3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529"} Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.279561 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.349803 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.349895 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.349942 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.350013 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.355322 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8" (OuterVolumeSpecName: "kube-api-access-8f5d8") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "kube-api-access-8f5d8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.359890 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts" (OuterVolumeSpecName: "scripts") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.378537 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.380698 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data" (OuterVolumeSpecName: "config-data") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452366 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452491 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452509 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452520 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.961462 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerDied","Data":"9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e"} Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.961524 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.961581 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.153327 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.154246 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" containerID="cri-o://93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.154319 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" containerID="cri-o://db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.165412 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.165659 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" containerID="cri-o://74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.200125 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.200416 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" containerID="cri-o://10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.200491 4804 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" containerID="cri-o://f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.741455 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779111 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779191 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779268 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779354 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779392 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779429 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.794247 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs" (OuterVolumeSpecName: "logs") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.794526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479" (OuterVolumeSpecName: "kube-api-access-m2479") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "kube-api-access-m2479". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.824713 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.826958 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data" (OuterVolumeSpecName: "config-data") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.841476 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.859506 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881770 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881810 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881824 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881839 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881851 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881861 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.971999 4804 generic.go:334] "Generic (PLEG): container finished" podID="3be0a823-7437-40f0-977e-0ceab74013ea" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" exitCode=0 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972030 4804 generic.go:334] 
"Generic (PLEG): container finished" podID="3be0a823-7437-40f0-977e-0ceab74013ea" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" exitCode=143 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972082 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerDied","Data":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972115 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerDied","Data":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972131 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerDied","Data":"5ed23a96a0046c231b096d9ec7822cad0493113fc209be6092efb93ac3aeb1f1"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972143 4804 scope.go:117] "RemoveContainer" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972534 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.978260 4804 generic.go:334] "Generic (PLEG): container finished" podID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" exitCode=143 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.978298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerDied","Data":"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.993262 4804 scope.go:117] "RemoveContainer" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.016544 4804 scope.go:117] "RemoveContainer" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.016835 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": container with ID starting with db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab not found: ID does not exist" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.016868 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} err="failed to get container status \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": rpc error: code = NotFound desc = could not find container \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": container with ID starting with db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab not 
found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.016888 4804 scope.go:117] "RemoveContainer" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.017071 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": container with ID starting with 93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d not found: ID does not exist" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017094 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} err="failed to get container status \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": rpc error: code = NotFound desc = could not find container \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": container with ID starting with 93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d not found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017106 4804 scope.go:117] "RemoveContainer" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017291 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} err="failed to get container status \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": rpc error: code = NotFound desc = could not find container \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": container with ID starting with 
db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab not found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017305 4804 scope.go:117] "RemoveContainer" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017466 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} err="failed to get container status \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": rpc error: code = NotFound desc = could not find container \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": container with ID starting with 93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d not found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.034653 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.052977 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066006 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerName="nova-manage" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066568 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerName="nova-manage" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066583 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="init" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066592 4804 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="init" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066607 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066615 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066647 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066655 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066664 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066670 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066890 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066912 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerName="nova-manage" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066929 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066937 4804 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.067906 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.070778 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.071463 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.071662 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.071866 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.085789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-config-data\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.085905 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htr8\" (UniqueName: \"kubernetes.io/projected/29528202-42d5-4bcd-90e8-335435ba59cf-kube-api-access-6htr8\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.085963 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " 
pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.086005 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29528202-42d5-4bcd-90e8-335435ba59cf-logs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.086046 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-public-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.086104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187566 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htr8\" (UniqueName: \"kubernetes.io/projected/29528202-42d5-4bcd-90e8-335435ba59cf-kube-api-access-6htr8\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187643 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187680 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/29528202-42d5-4bcd-90e8-335435ba59cf-logs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-public-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187755 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-config-data\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.188536 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29528202-42d5-4bcd-90e8-335435ba59cf-logs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.191572 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.192178 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-config-data\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.192756 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.192887 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-public-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.207160 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htr8\" (UniqueName: \"kubernetes.io/projected/29528202-42d5-4bcd-90e8-335435ba59cf-kube-api-access-6htr8\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.394184 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.588568 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" path="/var/lib/kubelet/pods/3be0a823-7437-40f0-977e-0ceab74013ea/volumes" Feb 17 13:49:32 crc kubenswrapper[4804]: W0217 13:49:32.882779 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29528202_42d5_4bcd_90e8_335435ba59cf.slice/crio-7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9 WatchSource:0}: Error finding container 7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9: Status 404 returned error can't find the container with id 7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9 Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.888005 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.990420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29528202-42d5-4bcd-90e8-335435ba59cf","Type":"ContainerStarted","Data":"7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9"} Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.001477 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29528202-42d5-4bcd-90e8-335435ba59cf","Type":"ContainerStarted","Data":"d31055369e630c0356125fed33078d98b08a03ad8963cdb41e62d2a70ef41392"} Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.001788 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29528202-42d5-4bcd-90e8-335435ba59cf","Type":"ContainerStarted","Data":"ba5b77b47bab0c422fcd3b22593d9afb83fc8548133092014d8241ba7f652aae"} Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.031701 4804 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.031677995 podStartE2EDuration="2.031677995s" podCreationTimestamp="2026-02-17 13:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:34.019483231 +0000 UTC m=+1448.130902568" watchObservedRunningTime="2026-02-17 13:49:34.031677995 +0000 UTC m=+1448.143097332" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.353833 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:48052->10.217.0.197:8775: read: connection reset by peer" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.353834 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:48044->10.217.0.197:8775: read: connection reset by peer" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.835029 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936682 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936753 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936796 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936853 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.937401 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs" (OuterVolumeSpecName: "logs") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.946149 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt" (OuterVolumeSpecName: "kube-api-access-rqplt") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "kube-api-access-rqplt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.968623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data" (OuterVolumeSpecName: "config-data") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.971598 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.980365 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.994847 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028302 4804 generic.go:334] "Generic (PLEG): container finished" podID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" exitCode=0 Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028377 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerDied","Data":"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028408 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerDied","Data":"7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028429 4804 scope.go:117] "RemoveContainer" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028539 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.033451 4804 generic.go:334] "Generic (PLEG): container finished" podID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" exitCode=0 Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.034354 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.034474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerDied","Data":"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.034494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerDied","Data":"45d059e86e213177abef9a85b8685f82c73749ae5fde7098a9e718ebf9c0ae93"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038782 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038813 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038825 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038838 4804 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038851 4804 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.053172 4804 scope.go:117] "RemoveContainer" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.084815 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.094092 4804 scope.go:117] "RemoveContainer" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.094559 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5\": container with ID starting with f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5 not found: ID does not exist" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.094600 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5"} err="failed to get container status \"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5\": rpc error: code = NotFound desc = could not find container \"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5\": container with ID starting with f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5 not found: ID does 
not exist" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.094632 4804 scope.go:117] "RemoveContainer" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.095616 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679\": container with ID starting with 10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679 not found: ID does not exist" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.095649 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679"} err="failed to get container status \"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679\": rpc error: code = NotFound desc = could not find container \"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679\": container with ID starting with 10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679 not found: ID does not exist" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.095672 4804 scope.go:117] "RemoveContainer" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.099631 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.128997 4804 scope.go:117] "RemoveContainer" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.132536 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a\": container with ID starting with 74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a not found: ID does not exist" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.132578 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a"} err="failed to get container status \"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a\": rpc error: code = NotFound desc = could not find container \"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a\": container with ID starting with 74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a not found: ID does not exist" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.142889 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.142952 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.143027 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 
13:49:35.154374 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb" (OuterVolumeSpecName: "kube-api-access-f65hb") pod "abf6d166-4c3f-4fb3-a3b5-f85d47adf823" (UID: "abf6d166-4c3f-4fb3-a3b5-f85d47adf823"). InnerVolumeSpecName "kube-api-access-f65hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.156862 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.157921 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.157966 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.157977 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.157984 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.158011 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158039 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158789 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" 
containerName="nova-metadata-metadata" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158825 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158837 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.160731 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.162956 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.163009 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.166774 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf6d166-4c3f-4fb3-a3b5-f85d47adf823" (UID: "abf6d166-4c3f-4fb3-a3b5-f85d47adf823"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.170508 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.186152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data" (OuterVolumeSpecName: "config-data") pod "abf6d166-4c3f-4fb3-a3b5-f85d47adf823" (UID: "abf6d166-4c3f-4fb3-a3b5-f85d47adf823"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.245378 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.245411 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.245421 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347425 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347492 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347828 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qwz\" (UniqueName: \"kubernetes.io/projected/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-kube-api-access-95qwz\") pod \"nova-metadata-0\" (UID: 
\"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347953 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-logs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.348113 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-config-data\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.374935 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.397806 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.404445 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.406129 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.409287 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.415858 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.449909 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95qwz\" (UniqueName: \"kubernetes.io/projected/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-kube-api-access-95qwz\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.449965 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-logs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-config-data\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450077 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-logs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.454332 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.454425 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-config-data\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.455938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.465303 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qwz\" (UniqueName: \"kubernetes.io/projected/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-kube-api-access-95qwz\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 
13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.478483 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.552210 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zqm\" (UniqueName: \"kubernetes.io/projected/1bac289d-58a7-4e23-8805-c48811d12d32-kube-api-access-d4zqm\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.552287 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.552350 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-config-data\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.655389 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zqm\" (UniqueName: \"kubernetes.io/projected/1bac289d-58a7-4e23-8805-c48811d12d32-kube-api-access-d4zqm\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.655752 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.655834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-config-data\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.669538 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-config-data\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.671410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.676996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zqm\" (UniqueName: \"kubernetes.io/projected/1bac289d-58a7-4e23-8805-c48811d12d32-kube-api-access-d4zqm\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.871845 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.949131 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.055104 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39","Type":"ContainerStarted","Data":"3cf3bf2971bbe3ac1ef8d149181e32f777d990eb9bd36af317d8f33ac12551c2"} Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.297746 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:36 crc kubenswrapper[4804]: W0217 13:49:36.300750 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bac289d_58a7_4e23_8805_c48811d12d32.slice/crio-e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb WatchSource:0}: Error finding container e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb: Status 404 returned error can't find the container with id e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.595246 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" path="/var/lib/kubelet/pods/aa87191a-671d-43c8-b8c2-e5e07a54af02/volumes" Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.596886 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" path="/var/lib/kubelet/pods/abf6d166-4c3f-4fb3-a3b5-f85d47adf823/volumes" Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.069254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39","Type":"ContainerStarted","Data":"a4494d51ea372199bc15d386c17bb86f13e7d289216155b8a43f96e65e292a84"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.069299 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39","Type":"ContainerStarted","Data":"eb07a7999853b3bec5d813d4c50bac369e3d9e4b6610762803ac28d7fb7b54bf"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.072558 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1bac289d-58a7-4e23-8805-c48811d12d32","Type":"ContainerStarted","Data":"dbf6c34cab2daa4c39572582f1eca7e5d0a1054b839d806faab4284e194858f0"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.072582 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1bac289d-58a7-4e23-8805-c48811d12d32","Type":"ContainerStarted","Data":"e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.099272 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.099254039 podStartE2EDuration="2.099254039s" podCreationTimestamp="2026-02-17 13:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:37.091249708 +0000 UTC m=+1451.202669045" watchObservedRunningTime="2026-02-17 13:49:37.099254039 +0000 UTC m=+1451.210673376" Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.123373 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.123346497 podStartE2EDuration="2.123346497s" podCreationTimestamp="2026-02-17 13:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-17 13:49:37.114249031 +0000 UTC m=+1451.225668408" watchObservedRunningTime="2026-02-17 13:49:37.123346497 +0000 UTC m=+1451.234765874" Feb 17 13:49:40 crc kubenswrapper[4804]: I0217 13:49:40.479705 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:49:40 crc kubenswrapper[4804]: I0217 13:49:40.480590 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:49:40 crc kubenswrapper[4804]: I0217 13:49:40.872867 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 13:49:42 crc kubenswrapper[4804]: I0217 13:49:42.395173 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:42 crc kubenswrapper[4804]: I0217 13:49:42.395284 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:43 crc kubenswrapper[4804]: I0217 13:49:43.410526 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29528202-42d5-4bcd-90e8-335435ba59cf" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:43 crc kubenswrapper[4804]: I0217 13:49:43.410637 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29528202-42d5-4bcd-90e8-335435ba59cf" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.478990 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.480740 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.872754 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.917965 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 13:49:46 crc kubenswrapper[4804]: I0217 13:49:46.214471 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 13:49:46 crc kubenswrapper[4804]: I0217 13:49:46.495474 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee4c15c1-5fb0-4605-9cb8-69a060ec0d39" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:46 crc kubenswrapper[4804]: I0217 13:49:46.495473 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee4c15c1-5fb0-4605-9cb8-69a060ec0d39" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:51 crc kubenswrapper[4804]: I0217 13:49:51.283779 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.428292 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.428801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.433910 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 
13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.436062 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 13:49:53 crc kubenswrapper[4804]: I0217 13:49:53.235106 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 13:49:53 crc kubenswrapper[4804]: I0217 13:49:53.241247 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.486534 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.486950 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.505926 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.512961 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.835918 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.836021 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.836100 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.837169 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.837312 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6" gracePeriod=600 Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.271834 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6" exitCode=0 Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.271913 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6"} Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.272510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"} Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.272557 4804 scope.go:117] "RemoveContainer" 
containerID="ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7" Feb 17 13:50:05 crc kubenswrapper[4804]: I0217 13:50:05.796772 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:07 crc kubenswrapper[4804]: I0217 13:50:07.317788 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:09 crc kubenswrapper[4804]: I0217 13:50:09.892044 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" containerID="cri-o://d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" gracePeriod=604796 Feb 17 13:50:11 crc kubenswrapper[4804]: I0217 13:50:11.203013 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" containerID="cri-o://e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb" gracePeriod=604797 Feb 17 13:50:11 crc kubenswrapper[4804]: I0217 13:50:11.499160 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Feb 17 13:50:11 crc kubenswrapper[4804]: I0217 13:50:11.626389 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.476950 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.493915 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.493988 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494135 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494173 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494296 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494378 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494413 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494465 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494526 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494566 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494646 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc 
kubenswrapper[4804]: I0217 13:50:16.494741 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.495213 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.495733 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.502850 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info" (OuterVolumeSpecName: "pod-info") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.502890 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.503120 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7" (OuterVolumeSpecName: "kube-api-access-qrqs7") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "kube-api-access-qrqs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.503875 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.512608 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.576897 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data" (OuterVolumeSpecName: "config-data") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.577861 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf" (OuterVolumeSpecName: "server-conf") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599271 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599310 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599322 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599331 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") on node \"crc\" DevicePath 
\"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599339 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599348 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603046 4804 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603062 4804 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603101 4804 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603113 4804 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614045 4804 generic.go:334] "Generic (PLEG): container finished" podID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" exitCode=0 Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614089 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerDied","Data":"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3"} Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614095 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614117 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerDied","Data":"1805a02bed1d8e8fe42a7072ff53aa627c043f3fc1570707e67a0dbc0d5ed7c3"} Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614136 4804 scope.go:117] "RemoveContainer" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.639778 4804 scope.go:117] "RemoveContainer" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.660118 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.674240 4804 scope.go:117] "RemoveContainer" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.684335 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3\": container with ID starting with d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3 not found: ID does not exist" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.684377 4804 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3"} err="failed to get container status \"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3\": rpc error: code = NotFound desc = could not find container \"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3\": container with ID starting with d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3 not found: ID does not exist" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.684400 4804 scope.go:117] "RemoveContainer" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.684818 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993\": container with ID starting with 762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993 not found: ID does not exist" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.684836 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993"} err="failed to get container status \"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993\": rpc error: code = NotFound desc = could not find container \"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993\": container with ID starting with 762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993 not found: ID does not exist" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.704580 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc 
kubenswrapper[4804]: I0217 13:50:16.710894 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.806131 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.969441 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.979323 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.987958 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.988367 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.988386 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.988403 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="setup-container" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.988410 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="setup-container" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.988612 4804 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.989598 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.991915 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.992091 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.992250 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.992414 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.995519 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cxlcf" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.995519 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.995816 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.003534 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009427 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " 
pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009722 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009846 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5f204e4-3b7a-4490-9c78-def5bf30f810-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009964 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010050 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010161 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 
13:50:17.010310 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5f204e4-3b7a-4490-9c78-def5bf30f810-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010507 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010583 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lz2n\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-kube-api-access-2lz2n\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010663 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112640 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112702 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5f204e4-3b7a-4490-9c78-def5bf30f810-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112747 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112780 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112803 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112843 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112914 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5f204e4-3b7a-4490-9c78-def5bf30f810-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112980 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113003 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lz2n\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-kube-api-access-2lz2n\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113022 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113099 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 
13:50:17.113796 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113899 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.116366 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.117460 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.118733 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.119751 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b5f204e4-3b7a-4490-9c78-def5bf30f810-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.119752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.119880 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5f204e4-3b7a-4490-9c78-def5bf30f810-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.121497 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.126396 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.131936 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lz2n\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-kube-api-access-2lz2n\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 
crc kubenswrapper[4804]: I0217 13:50:17.152142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.323049 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.625760 4804 generic.go:334] "Generic (PLEG): container finished" podID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerID="e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb" exitCode=0 Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.626007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerDied","Data":"e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb"} Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.776706 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.856770 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.962873 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.962937 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.962974 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963001 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963027 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc 
kubenswrapper[4804]: I0217 13:50:17.963055 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963114 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963179 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963223 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963263 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") 
pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.964347 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.964373 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.964580 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.968546 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q" (OuterVolumeSpecName: "kube-api-access-cxh4q") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "kube-api-access-cxh4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.969761 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.970908 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.971174 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info" (OuterVolumeSpecName: "pod-info") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.973281 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.009542 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data" (OuterVolumeSpecName: "config-data") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.024973 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf" (OuterVolumeSpecName: "server-conf") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064707 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064744 4804 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064753 4804 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064760 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 
crc kubenswrapper[4804]: I0217 13:50:18.064769 4804 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064777 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064787 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064811 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064820 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064830 4804 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.079994 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.083420 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.166970 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.167286 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.586321 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" path="/var/lib/kubelet/pods/7705a06d-bc27-4686-9ca4-4aae248ead07/volumes" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.638720 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerStarted","Data":"99bb36612b47257c670c0b1cca228d9c758565ee95e73e125105263015fbc589"} Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.641209 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerDied","Data":"3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee"} Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.641266 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.641288 4804 scope.go:117] "RemoveContainer" containerID="e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.673309 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.673974 4804 scope.go:117] "RemoveContainer" containerID="de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.702395 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.713502 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: E0217 13:50:18.714018 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.714032 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" Feb 17 13:50:18 crc kubenswrapper[4804]: E0217 13:50:18.714045 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="setup-container" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.714052 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="setup-container" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.714282 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.715293 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.723389 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.723434 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.723480 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724022 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-m99n4" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724254 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724428 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724669 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.728348 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.879693 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.879854 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.879925 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7ecd09-cd15-439d-9153-b55d9013bb83-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7ecd09-cd15-439d-9153-b55d9013bb83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880433 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65xc8\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-kube-api-access-65xc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880482 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880673 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880824 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880980 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.881012 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.982932 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983027 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983055 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983097 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983448 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.984260 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.984813 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.984915 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988410 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7ecd09-cd15-439d-9153-b55d9013bb83-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988629 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7ecd09-cd15-439d-9153-b55d9013bb83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988712 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65xc8\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-kube-api-access-65xc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.989257 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.990183 4804 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.992290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7ecd09-cd15-439d-9153-b55d9013bb83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.992892 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.998789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.015178 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65xc8\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-kube-api-access-65xc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.016626 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7ecd09-cd15-439d-9153-b55d9013bb83-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.049772 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.347926 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.653283 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerStarted","Data":"bbcf95a398f7b865c615ecb45334f7dffb36eeee0c13f1cb51c751c688537e45"} Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.876335 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:19 crc kubenswrapper[4804]: W0217 13:50:19.879034 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7ecd09_cd15_439d_9153_b55d9013bb83.slice/crio-43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7 WatchSource:0}: Error finding container 43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7: Status 404 returned error can't find the container with id 43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7 Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.586950 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" path="/var/lib/kubelet/pods/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad/volumes" Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.670328 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerStarted","Data":"43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7"} Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.863035 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.865303 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.867572 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.903584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.027727 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.027977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028061 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod 
\"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028126 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028163 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028226 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028427 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130424 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130504 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130528 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130625 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130700 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131402 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131412 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131606 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131686 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") 
pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.132072 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.132529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.159262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.198599 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: W0217 13:50:21.662734 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21997817_4e92_40db_a990_377f9cb88575.slice/crio-2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e WatchSource:0}: Error finding container 2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e: Status 404 returned error can't find the container with id 2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.675049 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.688722 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerStarted","Data":"65ada82f256453be4311fa6e9f31586da6868d81949300ee7ece41d6b113174c"} Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.692317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerStarted","Data":"2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e"} Feb 17 13:50:22 crc kubenswrapper[4804]: I0217 13:50:22.703521 4804 generic.go:334] "Generic (PLEG): container finished" podID="21997817-4e92-40db-a990-377f9cb88575" containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151" exitCode=0 Feb 17 13:50:22 crc kubenswrapper[4804]: I0217 13:50:22.703602 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerDied","Data":"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151"} Feb 17 13:50:23 crc 
kubenswrapper[4804]: I0217 13:50:23.716033 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerStarted","Data":"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"} Feb 17 13:50:23 crc kubenswrapper[4804]: I0217 13:50:23.716354 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:23 crc kubenswrapper[4804]: I0217 13:50:23.750549 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" podStartSLOduration=3.750530833 podStartE2EDuration="3.750530833s" podCreationTimestamp="2026-02-17 13:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:23.734326693 +0000 UTC m=+1497.845746030" watchObservedRunningTime="2026-02-17 13:50:23.750530833 +0000 UTC m=+1497.861950170" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.200361 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.285507 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.286164 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns" containerID="cri-o://d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" gracePeriod=10 Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.465341 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-2n5kn"] Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.475610 4804 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.490660 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-2n5kn"] Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524723 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-config\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524899 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524923 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524972 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.525107 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.525285 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqgn\" (UniqueName: \"kubernetes.io/projected/69619ab8-5a40-43b9-8e9c-1a6e39893605-kube-api-access-lhqgn\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.525424 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627259 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqgn\" (UniqueName: \"kubernetes.io/projected/69619ab8-5a40-43b9-8e9c-1a6e39893605-kube-api-access-lhqgn\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627340 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627424 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-config\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627488 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627506 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627547 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627639 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.628831 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.629742 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.632240 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.632792 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-config\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.633108 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.633363 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.647557 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqgn\" (UniqueName: \"kubernetes.io/projected/69619ab8-5a40-43b9-8e9c-1a6e39893605-kube-api-access-lhqgn\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.796016 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797779 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" exitCode=0 Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerDied","Data":"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b"} Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797896 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerDied","Data":"a2838e3552cf9ee264c86b1e5acbfc8482d43bbc95f8a3776ff5253f31fed64a"} Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797925 4804 scope.go:117] "RemoveContainer" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.807181 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.829846 4804 scope.go:117] "RemoveContainer" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.897562 4804 scope.go:117] "RemoveContainer" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" Feb 17 13:50:31 crc kubenswrapper[4804]: E0217 13:50:31.898101 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b\": container with ID starting with d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b not found: ID does not exist" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.898131 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b"} err="failed to get container status \"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b\": rpc error: code = NotFound desc = could not find container \"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b\": container with ID starting with d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b not found: ID does not exist" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.898157 4804 scope.go:117] "RemoveContainer" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" Feb 17 13:50:31 crc kubenswrapper[4804]: E0217 13:50:31.898542 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba\": container with ID starting with 
205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba not found: ID does not exist" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.898579 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba"} err="failed to get container status \"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba\": rpc error: code = NotFound desc = could not find container \"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba\": container with ID starting with 205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba not found: ID does not exist" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938354 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938718 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938751 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938795 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938846 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938912 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.945584 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz" (OuterVolumeSpecName: "kube-api-access-5jbtz") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "kube-api-access-5jbtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.999090 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.010583 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config" (OuterVolumeSpecName: "config") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.020049 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.028637 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.030519 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041635 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041669 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041682 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041694 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041705 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041719 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.268258 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-2n5kn"] Feb 17 13:50:32 crc kubenswrapper[4804]: W0217 13:50:32.271599 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69619ab8_5a40_43b9_8e9c_1a6e39893605.slice/crio-a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8 WatchSource:0}: Error finding container a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8: Status 404 returned error can't find the container with id a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8 Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.810617 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.812370 4804 generic.go:334] "Generic (PLEG): container finished" podID="69619ab8-5a40-43b9-8e9c-1a6e39893605" containerID="b62be41e8ccaaa638798d42205e9b4a49fe20a084508b5c1cf72e1a16901037d" exitCode=0 Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.812466 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" event={"ID":"69619ab8-5a40-43b9-8e9c-1a6e39893605","Type":"ContainerDied","Data":"b62be41e8ccaaa638798d42205e9b4a49fe20a084508b5c1cf72e1a16901037d"} Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.812530 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" event={"ID":"69619ab8-5a40-43b9-8e9c-1a6e39893605","Type":"ContainerStarted","Data":"a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8"} Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.865846 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.871689 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:50:33 crc kubenswrapper[4804]: I0217 13:50:33.822527 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" 
event={"ID":"69619ab8-5a40-43b9-8e9c-1a6e39893605","Type":"ContainerStarted","Data":"7588b9a633ef9fa67028868a7659fef6354d900704f38d8b0fff91f9262c248f"} Feb 17 13:50:33 crc kubenswrapper[4804]: I0217 13:50:33.823082 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:33 crc kubenswrapper[4804]: I0217 13:50:33.849304 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" podStartSLOduration=2.849283142 podStartE2EDuration="2.849283142s" podCreationTimestamp="2026-02-17 13:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:33.843801689 +0000 UTC m=+1507.955221026" watchObservedRunningTime="2026-02-17 13:50:33.849283142 +0000 UTC m=+1507.960702479" Feb 17 13:50:34 crc kubenswrapper[4804]: I0217 13:50:34.584617 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" path="/var/lib/kubelet/pods/1fd51afd-ae34-4a67-bb79-a12d396968ef/volumes" Feb 17 13:50:41 crc kubenswrapper[4804]: I0217 13:50:41.809261 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:41 crc kubenswrapper[4804]: I0217 13:50:41.879775 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:41 crc kubenswrapper[4804]: I0217 13:50:41.880048 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns" containerID="cri-o://1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416" gracePeriod=10 Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.410258 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.533816 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.533884 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.533946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534016 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534078 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534093 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534129 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.542999 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb" (OuterVolumeSpecName: "kube-api-access-9gwlb") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "kube-api-access-9gwlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.587349 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.594904 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.599490 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config" (OuterVolumeSpecName: "config") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.624639 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636490 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636521 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636531 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636541 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") on node \"crc\" DevicePath \"\"" 
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636549 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.637356 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.637598 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.738214 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.738440 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901164 4804 generic.go:334] "Generic (PLEG): container finished" podID="21997817-4e92-40db-a990-377f9cb88575" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416" exitCode=0 Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901277 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901281 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerDied","Data":"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"} Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerDied","Data":"2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e"} Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901439 4804 scope.go:117] "RemoveContainer" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.928170 4804 scope.go:117] "RemoveContainer" 
containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.944927 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.952450 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.972714 4804 scope.go:117] "RemoveContainer" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416" Feb 17 13:50:42 crc kubenswrapper[4804]: E0217 13:50:42.973273 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416\": container with ID starting with 1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416 not found: ID does not exist" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.973323 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"} err="failed to get container status \"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416\": rpc error: code = NotFound desc = could not find container \"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416\": container with ID starting with 1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416 not found: ID does not exist" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.973379 4804 scope.go:117] "RemoveContainer" containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151" Feb 17 13:50:42 crc kubenswrapper[4804]: E0217 13:50:42.974934 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151\": container with ID starting with f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151 not found: ID does not exist" containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151" Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.974968 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151"} err="failed to get container status \"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151\": rpc error: code = NotFound desc = could not find container \"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151\": container with ID starting with f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151 not found: ID does not exist" Feb 17 13:50:44 crc kubenswrapper[4804]: I0217 13:50:44.590251 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21997817-4e92-40db-a990-377f9cb88575" path="/var/lib/kubelet/pods/21997817-4e92-40db-a990-377f9cb88575/volumes" Feb 17 13:50:51 crc kubenswrapper[4804]: I0217 13:50:51.991359 4804 generic.go:334] "Generic (PLEG): container finished" podID="b5f204e4-3b7a-4490-9c78-def5bf30f810" containerID="bbcf95a398f7b865c615ecb45334f7dffb36eeee0c13f1cb51c751c688537e45" exitCode=0 Feb 17 13:50:51 crc kubenswrapper[4804]: I0217 13:50:51.991467 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerDied","Data":"bbcf95a398f7b865c615ecb45334f7dffb36eeee0c13f1cb51c751c688537e45"} Feb 17 13:50:53 crc kubenswrapper[4804]: I0217 13:50:53.003356 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerStarted","Data":"f37813e04ded1da97dc64ecb8e05603ecfccc96d0e8da449644b588c033afc96"} Feb 17 
13:50:53 crc kubenswrapper[4804]: I0217 13:50:53.004059 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 13:50:53 crc kubenswrapper[4804]: I0217 13:50:53.041810 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.041789558 podStartE2EDuration="37.041789558s" podCreationTimestamp="2026-02-17 13:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:53.027286022 +0000 UTC m=+1527.138705439" watchObservedRunningTime="2026-02-17 13:50:53.041789558 +0000 UTC m=+1527.153208905" Feb 17 13:50:54 crc kubenswrapper[4804]: I0217 13:50:54.013526 4804 generic.go:334] "Generic (PLEG): container finished" podID="4c7ecd09-cd15-439d-9153-b55d9013bb83" containerID="65ada82f256453be4311fa6e9f31586da6868d81949300ee7ece41d6b113174c" exitCode=0 Feb 17 13:50:54 crc kubenswrapper[4804]: I0217 13:50:54.014520 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerDied","Data":"65ada82f256453be4311fa6e9f31586da6868d81949300ee7ece41d6b113174c"} Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.025414 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerStarted","Data":"08e4011d2774f96d2bbb1f16dca387dc71133a1e4c6dfe8de710cb158c3a61a1"} Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.027114 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.056347 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.056327438 
podStartE2EDuration="37.056327438s" podCreationTimestamp="2026-02-17 13:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:55.052158667 +0000 UTC m=+1529.163578024" watchObservedRunningTime="2026-02-17 13:50:55.056327438 +0000 UTC m=+1529.167746775" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.079766 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"] Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080815 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080843 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns" Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080892 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="init" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080903 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="init" Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080934 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="init" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080946 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="init" Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080964 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080971 4804 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.081498 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.081544 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.082542 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.088543 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.088977 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.089284 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.101732 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.137032 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"] Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188643 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188766 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188788 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188806 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.290923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.291069 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.291096 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.291123 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.296073 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.296358 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.297875 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.320882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.407560 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.990624 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"] Feb 17 13:50:56 crc kubenswrapper[4804]: I0217 13:50:56.000968 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 13:50:56 crc kubenswrapper[4804]: I0217 13:50:56.052937 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerStarted","Data":"05e7fc719fab70095a5b91a4cef4c9ad73a02dbe075b07303a0bfd47bc2532bd"} Feb 17 13:51:07 crc kubenswrapper[4804]: I0217 13:51:07.326602 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.351393 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.821156 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.823637 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.831873 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.897661 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.897792 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.897885 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.000990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.001174 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.001277 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.002067 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.002132 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.022909 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.146652 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:11 crc kubenswrapper[4804]: I0217 13:51:11.230137 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerStarted","Data":"0232936103e32990e0e1d2addaaa872375bd84efce8c9daca39d54b7e7e36e24"} Feb 17 13:51:11 crc kubenswrapper[4804]: I0217 13:51:11.255961 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" podStartSLOduration=1.299775653 podStartE2EDuration="16.255942397s" podCreationTimestamp="2026-02-17 13:50:55 +0000 UTC" firstStartedPulling="2026-02-17 13:50:56.000559743 +0000 UTC m=+1530.111979120" lastFinishedPulling="2026-02-17 13:51:10.956726527 +0000 UTC m=+1545.068145864" observedRunningTime="2026-02-17 13:51:11.247388089 +0000 UTC m=+1545.358807426" watchObservedRunningTime="2026-02-17 13:51:11.255942397 +0000 UTC m=+1545.367361734" Feb 17 13:51:11 crc kubenswrapper[4804]: I0217 13:51:11.421958 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:11 crc kubenswrapper[4804]: W0217 13:51:11.430441 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceeb88bc_5350_49f1_9a6b_0ecb88f986e4.slice/crio-a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b WatchSource:0}: Error finding container a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b: Status 404 returned error can't find the container with id a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b Feb 17 13:51:12 crc kubenswrapper[4804]: I0217 13:51:12.246289 4804 generic.go:334] "Generic (PLEG): container finished" podID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" 
containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" exitCode=0 Feb 17 13:51:12 crc kubenswrapper[4804]: I0217 13:51:12.246383 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519"} Feb 17 13:51:12 crc kubenswrapper[4804]: I0217 13:51:12.247442 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerStarted","Data":"a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b"} Feb 17 13:51:13 crc kubenswrapper[4804]: I0217 13:51:13.264459 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerStarted","Data":"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2"} Feb 17 13:51:14 crc kubenswrapper[4804]: I0217 13:51:14.276630 4804 generic.go:334] "Generic (PLEG): container finished" podID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" exitCode=0 Feb 17 13:51:14 crc kubenswrapper[4804]: I0217 13:51:14.276753 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2"} Feb 17 13:51:15 crc kubenswrapper[4804]: I0217 13:51:15.287579 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerStarted","Data":"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb"} Feb 17 13:51:15 crc 
kubenswrapper[4804]: I0217 13:51:15.313364 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gp6sw" podStartSLOduration=3.9140360100000002 podStartE2EDuration="6.313343978s" podCreationTimestamp="2026-02-17 13:51:09 +0000 UTC" firstStartedPulling="2026-02-17 13:51:12.249813062 +0000 UTC m=+1546.361232429" lastFinishedPulling="2026-02-17 13:51:14.64912102 +0000 UTC m=+1548.760540397" observedRunningTime="2026-02-17 13:51:15.305481 +0000 UTC m=+1549.416900337" watchObservedRunningTime="2026-02-17 13:51:15.313343978 +0000 UTC m=+1549.424763315" Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.147751 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.148420 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.201168 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.379540 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.441550 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.354286 4804 generic.go:334] "Generic (PLEG): container finished" podID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerID="0232936103e32990e0e1d2addaaa872375bd84efce8c9daca39d54b7e7e36e24" exitCode=0 Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.354364 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" 
event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerDied","Data":"0232936103e32990e0e1d2addaaa872375bd84efce8c9daca39d54b7e7e36e24"} Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.354704 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gp6sw" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" containerID="cri-o://1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" gracePeriod=2 Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.848621 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.985443 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.985629 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.985742 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.986946 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities" 
(OuterVolumeSpecName: "utilities") pod "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" (UID: "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.993098 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f" (OuterVolumeSpecName: "kube-api-access-l9m5f") pod "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" (UID: "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4"). InnerVolumeSpecName "kube-api-access-l9m5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.052066 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" (UID: "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.088007 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.088052 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.088067 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365388 4804 generic.go:334] "Generic (PLEG): container finished" podID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" exitCode=0 Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365503 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb"} Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365752 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b"} Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365779 4804 scope.go:117] "RemoveContainer" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 
13:51:23.365553 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.413451 4804 scope.go:117] "RemoveContainer" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.424116 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.432340 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.503279 4804 scope.go:117] "RemoveContainer" containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571053 4804 scope.go:117] "RemoveContainer" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" Feb 17 13:51:24 crc kubenswrapper[4804]: E0217 13:51:23.571465 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb\": container with ID starting with 1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb not found: ID does not exist" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571502 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb"} err="failed to get container status \"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb\": rpc error: code = NotFound desc = could not find container \"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb\": container with ID starting with 
1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb not found: ID does not exist" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571530 4804 scope.go:117] "RemoveContainer" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" Feb 17 13:51:24 crc kubenswrapper[4804]: E0217 13:51:23.571822 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2\": container with ID starting with 6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2 not found: ID does not exist" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571851 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2"} err="failed to get container status \"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2\": rpc error: code = NotFound desc = could not find container \"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2\": container with ID starting with 6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2 not found: ID does not exist" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571863 4804 scope.go:117] "RemoveContainer" containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" Feb 17 13:51:24 crc kubenswrapper[4804]: E0217 13:51:23.572733 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519\": container with ID starting with ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519 not found: ID does not exist" containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" Feb 17 13:51:24 crc 
kubenswrapper[4804]: I0217 13:51:23.572751 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519"} err="failed to get container status \"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519\": rpc error: code = NotFound desc = could not find container \"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519\": container with ID starting with ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519 not found: ID does not exist" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.377885 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerDied","Data":"05e7fc719fab70095a5b91a4cef4c9ad73a02dbe075b07303a0bfd47bc2532bd"} Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.378252 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05e7fc719fab70095a5b91a4cef4c9ad73a02dbe075b07303a0bfd47bc2532bd" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.454038 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.585715 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" path="/var/lib/kubelet/pods/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4/volumes" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.620770 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.621045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.621921 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.622045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.627877 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m" (OuterVolumeSpecName: "kube-api-access-9s77m") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "kube-api-access-9s77m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.628002 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.651379 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory" (OuterVolumeSpecName: "inventory") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.651388 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725262 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725301 4804 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725312 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725324 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.390807 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561217 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f"] Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.561850 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561886 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.561917 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-utilities" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561930 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-utilities" Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.561973 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-content" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561985 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-content" Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.562001 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.562012 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.562339 4804 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.562390 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.563472 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.565613 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.566007 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.566233 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.568411 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.569000 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f"] Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.642109 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 
13:51:25.642699 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.642843 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.744738 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.744836 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.744866 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") 
pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.750960 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.751713 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.761271 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.880479 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:26 crc kubenswrapper[4804]: I0217 13:51:26.415688 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f"] Feb 17 13:51:26 crc kubenswrapper[4804]: I0217 13:51:26.631805 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.178565 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.182329 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.195949 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.371500 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.371546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.371693 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.409109 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerStarted","Data":"4a217d28653fcb3108dc054ac2dd9db14b19f53aeacc55277c807dba99e6cd5f"} Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.409159 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerStarted","Data":"0db74678f890e06c2b9958a4e27efc6ebec25a1ff0a24b96d6b328c59548fcfc"} Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.427136 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" podStartSLOduration=2.231576782 podStartE2EDuration="2.427114015s" podCreationTimestamp="2026-02-17 13:51:25 +0000 UTC" firstStartedPulling="2026-02-17 13:51:26.431580078 +0000 UTC m=+1560.542999415" lastFinishedPulling="2026-02-17 13:51:26.627117311 +0000 UTC m=+1560.738536648" observedRunningTime="2026-02-17 13:51:27.422992645 +0000 UTC m=+1561.534411982" watchObservedRunningTime="2026-02-17 13:51:27.427114015 +0000 UTC m=+1561.538533352" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.473456 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: 
I0217 13:51:27.473519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.473776 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.474023 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.474399 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.498442 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.501019 4804 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.008001 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.419432 4804 generic.go:334] "Generic (PLEG): container finished" podID="4b5520af-e860-4937-af9c-049b304c0cf9" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0" exitCode=0 Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.419532 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0"} Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.420423 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerStarted","Data":"5096b0ade58765cbb70c123fde8ddf796f5301f72982d1f2729abe092a910d91"} Feb 17 13:51:29 crc kubenswrapper[4804]: I0217 13:51:29.430505 4804 generic.go:334] "Generic (PLEG): container finished" podID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerID="4a217d28653fcb3108dc054ac2dd9db14b19f53aeacc55277c807dba99e6cd5f" exitCode=0 Feb 17 13:51:29 crc kubenswrapper[4804]: I0217 13:51:29.430610 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerDied","Data":"4a217d28653fcb3108dc054ac2dd9db14b19f53aeacc55277c807dba99e6cd5f"} Feb 17 13:51:29 crc kubenswrapper[4804]: I0217 13:51:29.436105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" 
event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerStarted","Data":"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"} Feb 17 13:51:30 crc kubenswrapper[4804]: I0217 13:51:30.445547 4804 generic.go:334] "Generic (PLEG): container finished" podID="4b5520af-e860-4937-af9c-049b304c0cf9" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4" exitCode=0 Feb 17 13:51:30 crc kubenswrapper[4804]: I0217 13:51:30.445620 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"} Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.008671 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.170395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.170467 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") pod \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.170634 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\" (UID: 
\"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.176810 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz" (OuterVolumeSpecName: "kube-api-access-z9rtz") pod "c87b0376-c505-452b-90ed-0e6bb7e6e8e0" (UID: "c87b0376-c505-452b-90ed-0e6bb7e6e8e0"). InnerVolumeSpecName "kube-api-access-z9rtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.201456 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory" (OuterVolumeSpecName: "inventory") pod "c87b0376-c505-452b-90ed-0e6bb7e6e8e0" (UID: "c87b0376-c505-452b-90ed-0e6bb7e6e8e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.201485 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c87b0376-c505-452b-90ed-0e6bb7e6e8e0" (UID: "c87b0376-c505-452b-90ed-0e6bb7e6e8e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.273237 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.273272 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.273286 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.455766 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerDied","Data":"0db74678f890e06c2b9958a4e27efc6ebec25a1ff0a24b96d6b328c59548fcfc"} Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.455825 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0db74678f890e06c2b9958a4e27efc6ebec25a1ff0a24b96d6b328c59548fcfc" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.455821 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.530749 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p"] Feb 17 13:51:31 crc kubenswrapper[4804]: E0217 13:51:31.531274 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.531299 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.531548 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.532334 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.535448 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.535480 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.536296 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.537053 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.553603 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p"] Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.679836 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.679937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 
13:51:31.680149 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.680370 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782156 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782297 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782344 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpnt6\" (UniqueName: 
\"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782444 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.786702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.786722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.790720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.800932 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.853026 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.408138 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p"] Feb 17 13:51:32 crc kubenswrapper[4804]: W0217 13:51:32.419166 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ee075c2_2363_4446_8545_dfdece6ca4da.slice/crio-18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e WatchSource:0}: Error finding container 18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e: Status 404 returned error can't find the container with id 18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.471406 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerStarted","Data":"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"} Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.475253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" 
event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerStarted","Data":"18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e"} Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.493976 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nb9wd" podStartSLOduration=1.747494457 podStartE2EDuration="5.493955989s" podCreationTimestamp="2026-02-17 13:51:27 +0000 UTC" firstStartedPulling="2026-02-17 13:51:28.421189835 +0000 UTC m=+1562.532609172" lastFinishedPulling="2026-02-17 13:51:32.167651367 +0000 UTC m=+1566.279070704" observedRunningTime="2026-02-17 13:51:32.489971363 +0000 UTC m=+1566.601390700" watchObservedRunningTime="2026-02-17 13:51:32.493955989 +0000 UTC m=+1566.605375336" Feb 17 13:51:33 crc kubenswrapper[4804]: I0217 13:51:33.486962 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerStarted","Data":"c30c97b714db6eaaea3d99e426020e3d5b0cd168a7762b36fc6e65e7574bc11f"} Feb 17 13:51:33 crc kubenswrapper[4804]: I0217 13:51:33.506451 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" podStartSLOduration=2.328123494 podStartE2EDuration="2.506430447s" podCreationTimestamp="2026-02-17 13:51:31 +0000 UTC" firstStartedPulling="2026-02-17 13:51:32.422122911 +0000 UTC m=+1566.533542258" lastFinishedPulling="2026-02-17 13:51:32.600429874 +0000 UTC m=+1566.711849211" observedRunningTime="2026-02-17 13:51:33.50334423 +0000 UTC m=+1567.614763577" watchObservedRunningTime="2026-02-17 13:51:33.506430447 +0000 UTC m=+1567.617849784" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.501664 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: 
I0217 13:51:37.502874 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.552941 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.607252 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.796222 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:39 crc kubenswrapper[4804]: I0217 13:51:39.541240 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nb9wd" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server" containerID="cri-o://905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9" gracePeriod=2 Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.090721 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.247327 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"4b5520af-e860-4937-af9c-049b304c0cf9\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.247683 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"4b5520af-e860-4937-af9c-049b304c0cf9\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.247942 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"4b5520af-e860-4937-af9c-049b304c0cf9\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.249496 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities" (OuterVolumeSpecName: "utilities") pod "4b5520af-e860-4937-af9c-049b304c0cf9" (UID: "4b5520af-e860-4937-af9c-049b304c0cf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.253486 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5" (OuterVolumeSpecName: "kube-api-access-bhqb5") pod "4b5520af-e860-4937-af9c-049b304c0cf9" (UID: "4b5520af-e860-4937-af9c-049b304c0cf9"). InnerVolumeSpecName "kube-api-access-bhqb5". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.271355 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b5520af-e860-4937-af9c-049b304c0cf9" (UID: "4b5520af-e860-4937-af9c-049b304c0cf9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.350544 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.350760 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") on node \"crc\" DevicePath \"\""
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.350873 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551508 4804 generic.go:334] "Generic (PLEG): container finished" podID="4b5520af-e860-4937-af9c-049b304c0cf9" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9" exitCode=0
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551552 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"}
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551576 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"5096b0ade58765cbb70c123fde8ddf796f5301f72982d1f2729abe092a910d91"}
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551590 4804 scope.go:117] "RemoveContainer" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551709 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.588932 4804 scope.go:117] "RemoveContainer" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.590036 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"]
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.623429 4804 scope.go:117] "RemoveContainer" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.639676 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"]
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.686360 4804 scope.go:117] "RemoveContainer" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"
Feb 17 13:51:40 crc kubenswrapper[4804]: E0217 13:51:40.695380 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9\": container with ID starting with 905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9 not found: ID does not exist" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.695432 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"} err="failed to get container status \"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9\": rpc error: code = NotFound desc = could not find container \"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9\": container with ID starting with 905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9 not found: ID does not exist"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.695465 4804 scope.go:117] "RemoveContainer" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"
Feb 17 13:51:40 crc kubenswrapper[4804]: E0217 13:51:40.699348 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4\": container with ID starting with a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4 not found: ID does not exist" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.699393 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"} err="failed to get container status \"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4\": rpc error: code = NotFound desc = could not find container \"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4\": container with ID starting with a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4 not found: ID does not exist"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.699418 4804 scope.go:117] "RemoveContainer" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0"
Feb 17 13:51:40 crc kubenswrapper[4804]: E0217 13:51:40.710405 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0\": container with ID starting with 1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0 not found: ID does not exist" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0"
Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.710458 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0"} err="failed to get container status \"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0\": rpc error: code = NotFound desc = could not find container \"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0\": container with ID starting with 1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0 not found: ID does not exist"
Feb 17 13:51:42 crc kubenswrapper[4804]: I0217 13:51:42.584421 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" path="/var/lib/kubelet/pods/4b5520af-e860-4937-af9c-049b304c0cf9/volumes"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.249163 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d49qr"]
Feb 17 13:51:59 crc kubenswrapper[4804]: E0217 13:51:59.251477 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.251584 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server"
Feb 17 13:51:59 crc kubenswrapper[4804]: E0217 13:51:59.251671 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-content"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.251742 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-content"
Feb 17 13:51:59 crc kubenswrapper[4804]: E0217 13:51:59.251838 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-utilities"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.251907 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-utilities"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.252188 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.254271 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.281444 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d49qr"]
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.339243 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.339540 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.339687 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442241 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.443410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.467436 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.581899 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.297642 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d49qr"]
Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.755178 4804 generic.go:334] "Generic (PLEG): container finished" podID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420" exitCode=0
Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.755254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420"}
Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.755290 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerStarted","Data":"ad252798f89f9f5003c362f37cb3de655136fdb2a16e7eaa3681ff60f9f272d2"}
Feb 17 13:52:02 crc kubenswrapper[4804]: I0217 13:52:02.784987 4804 generic.go:334] "Generic (PLEG): container finished" podID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31" exitCode=0
Feb 17 13:52:02 crc kubenswrapper[4804]: I0217 13:52:02.785186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31"}
Feb 17 13:52:03 crc kubenswrapper[4804]: I0217 13:52:03.797092 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerStarted","Data":"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"}
Feb 17 13:52:03 crc kubenswrapper[4804]: I0217 13:52:03.816427 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d49qr" podStartSLOduration=2.351987312 podStartE2EDuration="4.816402136s" podCreationTimestamp="2026-02-17 13:51:59 +0000 UTC" firstStartedPulling="2026-02-17 13:52:00.757067072 +0000 UTC m=+1594.868486409" lastFinishedPulling="2026-02-17 13:52:03.221481906 +0000 UTC m=+1597.332901233" observedRunningTime="2026-02-17 13:52:03.812276947 +0000 UTC m=+1597.923696274" watchObservedRunningTime="2026-02-17 13:52:03.816402136 +0000 UTC m=+1597.927821473"
Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.584506 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.585025 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.632491 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.897185 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:52:10 crc kubenswrapper[4804]: I0217 13:52:10.003922 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d49qr"]
Feb 17 13:52:11 crc kubenswrapper[4804]: I0217 13:52:11.868310 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d49qr" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" containerID="cri-o://b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05" gracePeriod=2
Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.131490 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b31d2b3_4599_40a8_b1c0_3f0f795cd13b.slice/crio-b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05.scope\": RecentStats: unable to find data in memory cache]"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.314938 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.406700 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") "
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.406915 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") "
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.406998 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") "
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.407930 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities" (OuterVolumeSpecName: "utilities") pod "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" (UID: "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.417161 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm" (OuterVolumeSpecName: "kube-api-access-wxrbm") pod "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" (UID: "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b"). InnerVolumeSpecName "kube-api-access-wxrbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.454456 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" (UID: "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.509246 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") on node \"crc\" DevicePath \"\""
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.509285 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.509295 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878831 4804 generic.go:334] "Generic (PLEG): container finished" podID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05" exitCode=0
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878874 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"}
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878899 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"ad252798f89f9f5003c362f37cb3de655136fdb2a16e7eaa3681ff60f9f272d2"}
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878915 4804 scope.go:117] "RemoveContainer" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878931 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d49qr"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.903792 4804 scope.go:117] "RemoveContainer" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.912244 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d49qr"]
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.931155 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d49qr"]
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.931214 4804 scope.go:117] "RemoveContainer" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.968476 4804 scope.go:117] "RemoveContainer" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"
Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.969811 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05\": container with ID starting with b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05 not found: ID does not exist" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.969850 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"} err="failed to get container status \"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05\": rpc error: code = NotFound desc = could not find container \"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05\": container with ID starting with b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05 not found: ID does not exist"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.969876 4804 scope.go:117] "RemoveContainer" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31"
Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.970313 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31\": container with ID starting with c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31 not found: ID does not exist" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.970354 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31"} err="failed to get container status \"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31\": rpc error: code = NotFound desc = could not find container \"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31\": container with ID starting with c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31 not found: ID does not exist"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.970380 4804 scope.go:117] "RemoveContainer" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420"
Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.970736 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420\": container with ID starting with 80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420 not found: ID does not exist" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420"
Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.970763 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420"} err="failed to get container status \"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420\": rpc error: code = NotFound desc = could not find container \"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420\": container with ID starting with 80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420 not found: ID does not exist"
Feb 17 13:52:14 crc kubenswrapper[4804]: I0217 13:52:14.585961 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" path="/var/lib/kubelet/pods/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b/volumes"
Feb 17 13:52:25 crc kubenswrapper[4804]: I0217 13:52:25.835453 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:52:25 crc kubenswrapper[4804]: I0217 13:52:25.835864 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:52:28 crc kubenswrapper[4804]: I0217 13:52:28.980663 4804 scope.go:117] "RemoveContainer" containerID="fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0"
Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.014601 4804 scope.go:117] "RemoveContainer" containerID="dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765"
Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.056668 4804 scope.go:117] "RemoveContainer" containerID="c204297abebd9a53145ab03c24cc8848ddb7478ea7164daa834f5efc7f82083d"
Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.082931 4804 scope.go:117] "RemoveContainer" containerID="525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c"
Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.139727 4804 scope.go:117] "RemoveContainer" containerID="b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4"
Feb 17 13:52:55 crc kubenswrapper[4804]: I0217 13:52:55.835804 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:52:55 crc kubenswrapper[4804]: I0217 13:52:55.836445 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.835756 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.837859 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.838428 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.839682 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.839997 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" gracePeriod=600
Feb 17 13:53:25 crc kubenswrapper[4804]: E0217 13:53:25.967517 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.618935 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" exitCode=0
Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.619024 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"}
Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.619337 4804 scope.go:117] "RemoveContainer" containerID="845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6"
Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.620027 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:53:26 crc kubenswrapper[4804]: E0217 13:53:26.620356 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:53:29 crc kubenswrapper[4804]: I0217 13:53:29.301535 4804 scope.go:117] "RemoveContainer" containerID="723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99"
Feb 17 13:53:29 crc kubenswrapper[4804]: I0217 13:53:29.331751 4804 scope.go:117] "RemoveContainer" containerID="2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9"
Feb 17 13:53:29 crc kubenswrapper[4804]: I0217 13:53:29.360924 4804 scope.go:117] "RemoveContainer" containerID="8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709"
Feb 17 13:53:41 crc kubenswrapper[4804]: I0217 13:53:41.575503 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:53:41 crc kubenswrapper[4804]: E0217 13:53:41.576416 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:53:55 crc kubenswrapper[4804]: I0217 13:53:55.574819 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:53:55 crc kubenswrapper[4804]: E0217 13:53:55.575609 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:54:09 crc kubenswrapper[4804]: I0217 13:54:09.574184 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:54:09 crc kubenswrapper[4804]: E0217 13:54:09.574828 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:54:22 crc kubenswrapper[4804]: I0217 13:54:22.574876 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:54:22 crc kubenswrapper[4804]: E0217 13:54:22.576318 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:54:30 crc kubenswrapper[4804]: I0217 13:54:30.256655 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerID="c30c97b714db6eaaea3d99e426020e3d5b0cd168a7762b36fc6e65e7574bc11f" exitCode=0
Feb 17 13:54:30 crc kubenswrapper[4804]: I0217 13:54:30.256729 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerDied","Data":"c30c97b714db6eaaea3d99e426020e3d5b0cd168a7762b36fc6e65e7574bc11f"}
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.773846 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p"
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.937946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") "
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.938182 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") "
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.938417 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") "
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.938479 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") "
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.944865 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6" (OuterVolumeSpecName: "kube-api-access-bpnt6") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "kube-api-access-bpnt6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.947503 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.966448 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory" (OuterVolumeSpecName: "inventory") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.971067 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.040848 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.041505 4804 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.041526 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.041536 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.302029 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerDied","Data":"18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e"} Feb 17 13:54:32 
crc kubenswrapper[4804]: I0217 13:54:32.302069 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.302113 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.463816 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc"] Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464227 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-utilities" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464242 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-utilities" Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464257 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464264 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464278 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464285 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464301 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-content" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464306 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-content" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464472 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464485 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.465035 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.467491 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.468162 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.468365 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.468520 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.480519 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc"] Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.651848 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.651897 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.652678 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.756012 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.756062 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.756112 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.760599 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.760625 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.779950 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 
13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.783925 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:33 crc kubenswrapper[4804]: I0217 13:54:33.282333 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc"] Feb 17 13:54:33 crc kubenswrapper[4804]: I0217 13:54:33.312971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerStarted","Data":"8403bf1cd856d5e0daf8826b7d963b23d0c64b1af10a42c915eb4e8853a3f40b"} Feb 17 13:54:34 crc kubenswrapper[4804]: I0217 13:54:34.332281 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerStarted","Data":"c179411b8526961212962ec76e7aa2e295a2ad91f22528c3d54a0da09b716dc4"} Feb 17 13:54:34 crc kubenswrapper[4804]: I0217 13:54:34.362894 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" podStartSLOduration=2.177384309 podStartE2EDuration="2.362865706s" podCreationTimestamp="2026-02-17 13:54:32 +0000 UTC" firstStartedPulling="2026-02-17 13:54:33.288236524 +0000 UTC m=+1747.399655861" lastFinishedPulling="2026-02-17 13:54:33.473717901 +0000 UTC m=+1747.585137258" observedRunningTime="2026-02-17 13:54:34.355434482 +0000 UTC m=+1748.466853849" watchObservedRunningTime="2026-02-17 13:54:34.362865706 +0000 UTC m=+1748.474285073" Feb 17 13:54:35 crc kubenswrapper[4804]: I0217 13:54:35.573628 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:54:35 crc kubenswrapper[4804]: E0217 13:54:35.574159 4804 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:54:49 crc kubenswrapper[4804]: I0217 13:54:49.574246 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:54:49 crc kubenswrapper[4804]: E0217 13:54:49.574962 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:01 crc kubenswrapper[4804]: I0217 13:55:01.574337 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:01 crc kubenswrapper[4804]: E0217 13:55:01.575110 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:14 crc kubenswrapper[4804]: I0217 13:55:14.574440 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:14 crc kubenswrapper[4804]: E0217 
13:55:14.575639 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:26 crc kubenswrapper[4804]: I0217 13:55:26.580902 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:26 crc kubenswrapper[4804]: E0217 13:55:26.581623 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:37 crc kubenswrapper[4804]: I0217 13:55:37.574727 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:37 crc kubenswrapper[4804]: E0217 13:55:37.575579 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.052107 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 
13:55:39.063619 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.071895 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.079806 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.088035 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.096346 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.105032 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.114292 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.583824 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" path="/var/lib/kubelet/pods/2edd89a7-0866-4677-8b25-9654130c6ac5/volumes" Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.584542 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" path="/var/lib/kubelet/pods/4bc37bd5-6784-41f8-98de-ef6a43493cd6/volumes" Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.585021 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" path="/var/lib/kubelet/pods/6f9dbe9b-ced6-453d-9f59-0d92e2a69043/volumes" Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.585662 4804 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" path="/var/lib/kubelet/pods/ba7e6539-c0c9-40e7-b076-38cc23f233cc/volumes" Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.026500 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.034146 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.060260 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.068994 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:55:44 crc kubenswrapper[4804]: I0217 13:55:44.588693 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" path="/var/lib/kubelet/pods/4c8ee09a-97bd-4497-81cd-2f0f4952d996/volumes" Feb 17 13:55:44 crc kubenswrapper[4804]: I0217 13:55:44.590686 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" path="/var/lib/kubelet/pods/b0597f43-df0a-427f-b045-e6859849a0d6/volumes" Feb 17 13:55:49 crc kubenswrapper[4804]: I0217 13:55:49.494505 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerID="c179411b8526961212962ec76e7aa2e295a2ad91f22528c3d54a0da09b716dc4" exitCode=0 Feb 17 13:55:49 crc kubenswrapper[4804]: I0217 13:55:49.494584 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerDied","Data":"c179411b8526961212962ec76e7aa2e295a2ad91f22528c3d54a0da09b716dc4"} Feb 17 13:55:50 crc kubenswrapper[4804]: I0217 13:55:50.963258 4804 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.018546 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.019148 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.019448 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.046460 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5" (OuterVolumeSpecName: "kube-api-access-mdrg5") pod "5ecc3e55-21c0-4017-8dce-9c77fd2189ea" (UID: "5ecc3e55-21c0-4017-8dce-9c77fd2189ea"). InnerVolumeSpecName "kube-api-access-mdrg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.068509 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory" (OuterVolumeSpecName: "inventory") pod "5ecc3e55-21c0-4017-8dce-9c77fd2189ea" (UID: "5ecc3e55-21c0-4017-8dce-9c77fd2189ea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.091278 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.109568 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.117678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5ecc3e55-21c0-4017-8dce-9c77fd2189ea" (UID: "5ecc3e55-21c0-4017-8dce-9c77fd2189ea"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.121620 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") on node \"crc\" DevicePath \"\"" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.121656 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.121668 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.519036 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerDied","Data":"8403bf1cd856d5e0daf8826b7d963b23d0c64b1af10a42c915eb4e8853a3f40b"} Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.519079 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8403bf1cd856d5e0daf8826b7d963b23d0c64b1af10a42c915eb4e8853a3f40b" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.519140 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.649681 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"] Feb 17 13:55:51 crc kubenswrapper[4804]: E0217 13:55:51.650446 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.650462 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.651052 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.651862 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.655865 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.656217 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.656346 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.656465 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.665262 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"] Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.838672 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.838909 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc 
kubenswrapper[4804]: I0217 13:55:51.838995 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.940433 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.940516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.940596 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.944926 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.945123 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.964374 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.976057 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.489542 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"] Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.537592 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerStarted","Data":"c1ff2f72248d2caefe219eeeda003dc9466862a80496ba52f9eda2e288c07614"} Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.575190 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:52 crc kubenswrapper[4804]: E0217 13:55:52.575552 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.585055 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" path="/var/lib/kubelet/pods/6c3b824f-ae3d-4681-8b14-16099a2643d5/volumes" Feb 17 13:55:53 crc kubenswrapper[4804]: I0217 13:55:53.547363 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerStarted","Data":"8502d67bffeaa50b847ab11945f925a93eebe7c2984cfa9357e9a9dabde733e2"} Feb 17 13:55:53 crc kubenswrapper[4804]: I0217 13:55:53.563703 4804 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" podStartSLOduration=2.3661311400000002 podStartE2EDuration="2.56368516s" podCreationTimestamp="2026-02-17 13:55:51 +0000 UTC" firstStartedPulling="2026-02-17 13:55:52.496998982 +0000 UTC m=+1826.608418319" lastFinishedPulling="2026-02-17 13:55:52.694553002 +0000 UTC m=+1826.805972339" observedRunningTime="2026-02-17 13:55:53.562559936 +0000 UTC m=+1827.673979273" watchObservedRunningTime="2026-02-17 13:55:53.56368516 +0000 UTC m=+1827.675104497" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.041575 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.046937 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.058476 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.158129 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.158189 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 
13:56:03.158252 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260233 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260533 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260703 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260735 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260949 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.280936 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.373721 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.573660 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:56:03 crc kubenswrapper[4804]: E0217 13:56:03.574174 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.848575 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.650262 4804 generic.go:334] "Generic (PLEG): container finished" podID="59301759-1bac-4d09-97be-b829e799b4d8" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4" exitCode=0 Feb 17 
13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.650350 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4"} Feb 17 13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.650540 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerStarted","Data":"fdf24c14126994bd4aa4f0024928980d88141f189583e3981f58728c0a0db1c4"} Feb 17 13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.653228 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 13:56:08 crc kubenswrapper[4804]: I0217 13:56:08.040990 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:56:08 crc kubenswrapper[4804]: I0217 13:56:08.052135 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:56:08 crc kubenswrapper[4804]: I0217 13:56:08.589576 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" path="/var/lib/kubelet/pods/dfb6c8ec-f280-4566-bb37-b286119956b5/volumes" Feb 17 13:56:11 crc kubenswrapper[4804]: I0217 13:56:11.718339 4804 generic.go:334] "Generic (PLEG): container finished" podID="59301759-1bac-4d09-97be-b829e799b4d8" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87" exitCode=0 Feb 17 13:56:11 crc kubenswrapper[4804]: I0217 13:56:11.718464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87"} Feb 17 13:56:12 crc kubenswrapper[4804]: I0217 13:56:12.730950 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerStarted","Data":"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"} Feb 17 13:56:12 crc kubenswrapper[4804]: I0217 13:56:12.764885 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lvbbn" podStartSLOduration=2.212643188 podStartE2EDuration="9.764867609s" podCreationTimestamp="2026-02-17 13:56:03 +0000 UTC" firstStartedPulling="2026-02-17 13:56:04.652968601 +0000 UTC m=+1838.764387938" lastFinishedPulling="2026-02-17 13:56:12.205193032 +0000 UTC m=+1846.316612359" observedRunningTime="2026-02-17 13:56:12.756632806 +0000 UTC m=+1846.868052143" watchObservedRunningTime="2026-02-17 13:56:12.764867609 +0000 UTC m=+1846.876286946" Feb 17 13:56:13 crc kubenswrapper[4804]: I0217 13:56:13.374658 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:13 crc kubenswrapper[4804]: I0217 13:56:13.375032 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:14 crc kubenswrapper[4804]: I0217 13:56:14.430821 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lvbbn" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server" probeResult="failure" output=< Feb 17 13:56:14 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s Feb 17 13:56:14 crc kubenswrapper[4804]: > Feb 17 13:56:15 crc kubenswrapper[4804]: I0217 13:56:15.025527 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 13:56:15 crc kubenswrapper[4804]: I0217 13:56:15.034492 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 
13:56:16 crc kubenswrapper[4804]: I0217 13:56:16.591864 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" path="/var/lib/kubelet/pods/4895769c-ef45-40c8-a8ae-0c5cb954dab2/volumes" Feb 17 13:56:17 crc kubenswrapper[4804]: I0217 13:56:17.574372 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:56:17 crc kubenswrapper[4804]: E0217 13:56:17.574674 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.033592 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.042499 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.053346 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.061881 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.069543 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.077712 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.084791 4804 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.091328 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.098506 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.105360 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.591013 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" path="/var/lib/kubelet/pods/26fadc7a-6cf8-4ea0-8609-50e585db4115/volumes" Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.591841 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" path="/var/lib/kubelet/pods/35f1cc1f-a736-4c02-9c26-726c0c6f0d59/volumes" Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.592591 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" path="/var/lib/kubelet/pods/60ee8426-dcbf-4430-8594-68ee778a8bbc/volumes" Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.593390 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" path="/var/lib/kubelet/pods/e26c9257-7102-4d48-8999-c0a3f0ca4009/volumes" Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.594989 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" path="/var/lib/kubelet/pods/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc/volumes" Feb 17 13:56:23 crc kubenswrapper[4804]: I0217 13:56:23.433274 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:23 crc kubenswrapper[4804]: I0217 13:56:23.501661 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:23 crc kubenswrapper[4804]: I0217 13:56:23.683091 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.030091 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.039681 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.588676 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" path="/var/lib/kubelet/pods/fd9036c7-1cff-4fb8-9af2-90057c4251dc/volumes" Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.853188 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lvbbn" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server" containerID="cri-o://c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e" gracePeriod=2 Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.301087 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.385525 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"59301759-1bac-4d09-97be-b829e799b4d8\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.385599 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"59301759-1bac-4d09-97be-b829e799b4d8\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.385801 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"59301759-1bac-4d09-97be-b829e799b4d8\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.389229 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities" (OuterVolumeSpecName: "utilities") pod "59301759-1bac-4d09-97be-b829e799b4d8" (UID: "59301759-1bac-4d09-97be-b829e799b4d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.393306 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz" (OuterVolumeSpecName: "kube-api-access-w97tz") pod "59301759-1bac-4d09-97be-b829e799b4d8" (UID: "59301759-1bac-4d09-97be-b829e799b4d8"). InnerVolumeSpecName "kube-api-access-w97tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.487800 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.487841 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") on node \"crc\" DevicePath \"\"" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.526501 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59301759-1bac-4d09-97be-b829e799b4d8" (UID: "59301759-1bac-4d09-97be-b829e799b4d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.593392 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865292 4804 generic.go:334] "Generic (PLEG): container finished" podID="59301759-1bac-4d09-97be-b829e799b4d8" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e" exitCode=0 Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865343 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"} Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865404 4804 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"fdf24c14126994bd4aa4f0024928980d88141f189583e3981f58728c0a0db1c4"} Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865421 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865435 4804 scope.go:117] "RemoveContainer" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.917176 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.917315 4804 scope.go:117] "RemoveContainer" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.934770 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.954662 4804 scope.go:117] "RemoveContainer" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.984739 4804 scope.go:117] "RemoveContainer" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e" Feb 17 13:56:25 crc kubenswrapper[4804]: E0217 13:56:25.985309 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e\": container with ID starting with c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e not found: ID does not exist" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.985363 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"} err="failed to get container status \"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e\": rpc error: code = NotFound desc = could not find container \"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e\": container with ID starting with c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e not found: ID does not exist" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.985396 4804 scope.go:117] "RemoveContainer" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87" Feb 17 13:56:25 crc kubenswrapper[4804]: E0217 13:56:25.985935 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87\": container with ID starting with b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87 not found: ID does not exist" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.985977 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87"} err="failed to get container status \"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87\": rpc error: code = NotFound desc = could not find container \"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87\": container with ID starting with b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87 not found: ID does not exist" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.986022 4804 scope.go:117] "RemoveContainer" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4" Feb 17 13:56:25 crc kubenswrapper[4804]: E0217 
13:56:25.986471 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4\": container with ID starting with 3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4 not found: ID does not exist" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4" Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.986499 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4"} err="failed to get container status \"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4\": rpc error: code = NotFound desc = could not find container \"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4\": container with ID starting with 3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4 not found: ID does not exist" Feb 17 13:56:26 crc kubenswrapper[4804]: I0217 13:56:26.588687 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59301759-1bac-4d09-97be-b829e799b4d8" path="/var/lib/kubelet/pods/59301759-1bac-4d09-97be-b829e799b4d8/volumes" Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.483520 4804 scope.go:117] "RemoveContainer" containerID="523c2a0dce1e6efc07d04ec334853ccdc0d1e041c66ee6b003b630197674d70f" Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.506742 4804 scope.go:117] "RemoveContainer" containerID="9b6aded40ee8715e414f7eaa0e4d2635fac772bb7db34b9cafa3737130656836" Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.556513 4804 scope.go:117] "RemoveContainer" containerID="9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1" Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.607842 4804 scope.go:117] "RemoveContainer" containerID="95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3" Feb 17 13:56:29 crc 
kubenswrapper[4804]: I0217 13:56:29.664927 4804 scope.go:117] "RemoveContainer" containerID="f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.704003 4804 scope.go:117] "RemoveContainer" containerID="df5f178d05ce64eb60f91663ba876543b059e11efed3814a687a5cde6c71f197"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.746217 4804 scope.go:117] "RemoveContainer" containerID="195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.767367 4804 scope.go:117] "RemoveContainer" containerID="a042fc58bb60ee18221f1218414ff109d197e288fe316a76abf5d21b41df0c21"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.785791 4804 scope.go:117] "RemoveContainer" containerID="ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.811337 4804 scope.go:117] "RemoveContainer" containerID="5707e03ce1413559d6e451944a8178ed7c1374503c523227f07af12a0d1deda1"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.833807 4804 scope.go:117] "RemoveContainer" containerID="e16d35978c1a93f38aec046090d4bb89a7fa37eda37be7158b82151bac67e327"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.850679 4804 scope.go:117] "RemoveContainer" containerID="2af0e585925ef4ba3eb4997ba9a346fe72a20fb7f9f2943dcb04719e80a69278"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.871522 4804 scope.go:117] "RemoveContainer" containerID="4a8cd13cbb3ba23bfa180f42dc167734c03b2d4bcdf0842db5532816b1f0b9bd"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.893509 4804 scope.go:117] "RemoveContainer" containerID="7dbf5f5d88a50f9cfadbbf6692ca887131d2b4df1c33d00e1f7267394ff4525b"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.939560 4804 scope.go:117] "RemoveContainer" containerID="e84b0f31988f4caf559aaf77b9c196ea5e660cca5bf9a529065d3d4f3f6186e1"
Feb 17 13:56:30 crc kubenswrapper[4804]: I0217 13:56:30.578193 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:56:30 crc kubenswrapper[4804]: E0217 13:56:30.578800 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:56:42 crc kubenswrapper[4804]: I0217 13:56:42.573689 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:56:42 crc kubenswrapper[4804]: E0217 13:56:42.574417 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:56:54 crc kubenswrapper[4804]: I0217 13:56:54.575599 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:56:54 crc kubenswrapper[4804]: E0217 13:56:54.576920 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.046164 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jltn7"]
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.058269 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jltn7"]
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.221142 4804 generic.go:334] "Generic (PLEG): container finished" podID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerID="8502d67bffeaa50b847ab11945f925a93eebe7c2984cfa9357e9a9dabde733e2" exitCode=0
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.221194 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerDied","Data":"8502d67bffeaa50b847ab11945f925a93eebe7c2984cfa9357e9a9dabde733e2"}
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.584853 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15102ce-82ca-49c8-a069-25469380b043" path="/var/lib/kubelet/pods/f15102ce-82ca-49c8-a069-25469380b043/volumes"
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.627338 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.758309 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"5c4e88aa-842f-453a-9ce9-8354c16340e9\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") "
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.758352 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"5c4e88aa-842f-453a-9ce9-8354c16340e9\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") "
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.758504 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"5c4e88aa-842f-453a-9ce9-8354c16340e9\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") "
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.767416 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r" (OuterVolumeSpecName: "kube-api-access-5854r") pod "5c4e88aa-842f-453a-9ce9-8354c16340e9" (UID: "5c4e88aa-842f-453a-9ce9-8354c16340e9"). InnerVolumeSpecName "kube-api-access-5854r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.784173 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5c4e88aa-842f-453a-9ce9-8354c16340e9" (UID: "5c4e88aa-842f-453a-9ce9-8354c16340e9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.808265 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory" (OuterVolumeSpecName: "inventory") pod "5c4e88aa-842f-453a-9ce9-8354c16340e9" (UID: "5c4e88aa-842f-453a-9ce9-8354c16340e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.863764 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.863816 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.863831 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.241883 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerDied","Data":"c1ff2f72248d2caefe219eeeda003dc9466862a80496ba52f9eda2e288c07614"}
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.242136 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1ff2f72248d2caefe219eeeda003dc9466862a80496ba52f9eda2e288c07614"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.241965 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.323547 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"]
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.323995 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324021 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server"
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.324043 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324052 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.324076 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-content"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324084 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-content"
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.324096 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-utilities"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324104 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-utilities"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324352 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324386 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.325087 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.327920 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.328099 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.328144 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.329151 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.378810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.378884 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.378931 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.392046 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"]
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.480556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.481171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.481531 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.484369 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.484946 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.495919 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.707383 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:59 crc kubenswrapper[4804]: I0217 13:56:59.309615 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"]
Feb 17 13:57:00 crc kubenswrapper[4804]: I0217 13:57:00.264448 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerStarted","Data":"48c424eb76a8a3f09a7eda61042a1fd810e7b2c28cdae1f17671d3c2a494e448"}
Feb 17 13:57:00 crc kubenswrapper[4804]: I0217 13:57:00.264797 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerStarted","Data":"f6bc0e9de5cf01fa79d075997918dc6977588636afdac3a50a3625731e798c42"}
Feb 17 13:57:00 crc kubenswrapper[4804]: I0217 13:57:00.292518 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" podStartSLOduration=2.113400217 podStartE2EDuration="2.29249222s" podCreationTimestamp="2026-02-17 13:56:58 +0000 UTC" firstStartedPulling="2026-02-17 13:56:59.313923654 +0000 UTC m=+1893.425343001" lastFinishedPulling="2026-02-17 13:56:59.493015667 +0000 UTC m=+1893.604435004" observedRunningTime="2026-02-17 13:57:00.285886217 +0000 UTC m=+1894.397305554" watchObservedRunningTime="2026-02-17 13:57:00.29249222 +0000 UTC m=+1894.403911587"
Feb 17 13:57:04 crc kubenswrapper[4804]: I0217 13:57:04.303604 4804 generic.go:334] "Generic (PLEG): container finished" podID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerID="48c424eb76a8a3f09a7eda61042a1fd810e7b2c28cdae1f17671d3c2a494e448" exitCode=0
Feb 17 13:57:04 crc kubenswrapper[4804]: I0217 13:57:04.303670 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerDied","Data":"48c424eb76a8a3f09a7eda61042a1fd810e7b2c28cdae1f17671d3c2a494e448"}
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.053613 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.071246 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xf9m6"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.081620 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xf9m6"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.089085 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.738653 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.832814 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"ed6642bc-b49f-4e17-a721-b3eae09246aa\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") "
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.833178 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"ed6642bc-b49f-4e17-a721-b3eae09246aa\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") "
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.833501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"ed6642bc-b49f-4e17-a721-b3eae09246aa\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") "
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.838474 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8" (OuterVolumeSpecName: "kube-api-access-gwzk8") pod "ed6642bc-b49f-4e17-a721-b3eae09246aa" (UID: "ed6642bc-b49f-4e17-a721-b3eae09246aa"). InnerVolumeSpecName "kube-api-access-gwzk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.857769 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed6642bc-b49f-4e17-a721-b3eae09246aa" (UID: "ed6642bc-b49f-4e17-a721-b3eae09246aa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.860764 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory" (OuterVolumeSpecName: "inventory") pod "ed6642bc-b49f-4e17-a721-b3eae09246aa" (UID: "ed6642bc-b49f-4e17-a721-b3eae09246aa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.936381 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") on node \"crc\" DevicePath \"\""
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.936419 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.936430 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.024479 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jz9x9"]
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.032062 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jz9x9"]
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.326978 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerDied","Data":"f6bc0e9de5cf01fa79d075997918dc6977588636afdac3a50a3625731e798c42"}
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.327056 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6bc0e9de5cf01fa79d075997918dc6977588636afdac3a50a3625731e798c42"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.327287 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.406616 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"]
Feb 17 13:57:06 crc kubenswrapper[4804]: E0217 13:57:06.409547 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.409601 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.410177 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.411494 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.413268 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.413960 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.414417 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.416096 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.425784 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"]
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.545989 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.546104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.546367 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.587561 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" path="/var/lib/kubelet/pods/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d/volumes"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.588341 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" path="/var/lib/kubelet/pods/19dd0c13-b898-4147-ae5f-cbc5d4915910/volumes"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.589296 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" path="/var/lib/kubelet/pods/96609ec5-c9e0-4611-85ff-f7dc474d889a/volumes"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.649009 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.650118 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.650266 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.653121 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.654959 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.668592 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.743964 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:07 crc kubenswrapper[4804]: I0217 13:57:07.270543 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"]
Feb 17 13:57:07 crc kubenswrapper[4804]: I0217 13:57:07.337863 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerStarted","Data":"5a54e726599954e8f70bf35fe6823e9ba4cce6b5cfacc29c1da8fa06b495654d"}
Feb 17 13:57:08 crc kubenswrapper[4804]: I0217 13:57:08.349850 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerStarted","Data":"906fe7976606f8774bf1bfc4aa2db398a2e4a0a71af3d59f4fd4237e7b6c786c"}
Feb 17 13:57:08 crc kubenswrapper[4804]: I0217 13:57:08.370999 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" podStartSLOduration=2.195539309 podStartE2EDuration="2.37097786s" podCreationTimestamp="2026-02-17 13:57:06 +0000 UTC" firstStartedPulling="2026-02-17 13:57:07.283843105 +0000 UTC m=+1901.395262442" lastFinishedPulling="2026-02-17 13:57:07.459281656 +0000 UTC m=+1901.570700993" observedRunningTime="2026-02-17 13:57:08.364902453 +0000 UTC m=+1902.476321810" watchObservedRunningTime="2026-02-17 13:57:08.37097786 +0000 UTC m=+1902.482397207"
Feb 17 13:57:08 crc kubenswrapper[4804]: I0217 13:57:08.574802 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:57:08 crc kubenswrapper[4804]: E0217 13:57:08.575118 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:57:21 crc kubenswrapper[4804]: I0217 13:57:21.032118 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f9zkj"]
Feb 17 13:57:21 crc kubenswrapper[4804]: I0217 13:57:21.039983 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f9zkj"]
Feb 17 13:57:22 crc kubenswrapper[4804]: I0217 13:57:22.587566 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a921c8-6579-451b-beaf-9832cf900668" path="/var/lib/kubelet/pods/02a921c8-6579-451b-beaf-9832cf900668/volumes"
Feb 17 13:57:23 crc kubenswrapper[4804]: I0217 13:57:23.575157 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:57:23 crc kubenswrapper[4804]: E0217 13:57:23.576075 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.238043 4804 scope.go:117] "RemoveContainer" containerID="872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e"
Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.276227 4804 scope.go:117] "RemoveContainer" containerID="604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64"
Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.322145 4804 scope.go:117] "RemoveContainer" containerID="ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc"
Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.352725 4804 scope.go:117] "RemoveContainer" containerID="2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2"
Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.411005 4804 scope.go:117] "RemoveContainer" containerID="4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2"
Feb 17 13:57:37 crc kubenswrapper[4804]: I0217 13:57:37.574096 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:57:37 crc kubenswrapper[4804]: E0217 13:57:37.575200 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:57:41 crc kubenswrapper[4804]: I0217 13:57:41.651525 4804 generic.go:334] "Generic (PLEG): container finished" podID="e9b53a85-8a87-4b65-8832-00c4175da541" containerID="906fe7976606f8774bf1bfc4aa2db398a2e4a0a71af3d59f4fd4237e7b6c786c" exitCode=0
Feb 17 13:57:41 crc kubenswrapper[4804]: I0217 13:57:41.651562 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerDied","Data":"906fe7976606f8774bf1bfc4aa2db398a2e4a0a71af3d59f4fd4237e7b6c786c"}
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.083984 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.186910 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") "
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.186996 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") "
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.187057 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") "
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.193003 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2" (OuterVolumeSpecName: "kube-api-access-g2tz2") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541"). InnerVolumeSpecName "kube-api-access-g2tz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:57:43 crc kubenswrapper[4804]: E0217 13:57:43.215639 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam podName:e9b53a85-8a87-4b65-8832-00c4175da541 nodeName:}" failed. No retries permitted until 2026-02-17 13:57:43.715601579 +0000 UTC m=+1937.827020916 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541") : error deleting /var/lib/kubelet/pods/e9b53a85-8a87-4b65-8832-00c4175da541/volume-subpaths: remove /var/lib/kubelet/pods/e9b53a85-8a87-4b65-8832-00c4175da541/volume-subpaths: no such file or directory
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.218292 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory" (OuterVolumeSpecName: "inventory") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.289286 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.289321 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") on node \"crc\" DevicePath \"\""
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.671417 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerDied","Data":"5a54e726599954e8f70bf35fe6823e9ba4cce6b5cfacc29c1da8fa06b495654d"}
Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.671471 4804 pod_container_deletor.go:80] "Container not found in
pod's containers" containerID="5a54e726599954e8f70bf35fe6823e9ba4cce6b5cfacc29c1da8fa06b495654d" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.671481 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.758588 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq"] Feb 17 13:57:43 crc kubenswrapper[4804]: E0217 13:57:43.758969 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b53a85-8a87-4b65-8832-00c4175da541" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.758988 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b53a85-8a87-4b65-8832-00c4175da541" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.759185 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b53a85-8a87-4b65-8832-00c4175da541" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.759795 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.770396 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq"] Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.809566 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.813666 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.912850 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.912925 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.913013 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.913078 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.014824 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.014900 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.014961 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.032098 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.042068 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.050132 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.078892 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.704120 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq"] Feb 17 13:57:45 crc kubenswrapper[4804]: I0217 13:57:45.702449 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerStarted","Data":"61d3aa7b960b08aa92c348033cb6b61247c79bcc289b0a714c865d0e129fa428"} Feb 17 13:57:45 crc kubenswrapper[4804]: I0217 13:57:45.702829 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerStarted","Data":"3c10d42299865df6258c432ca9ac58a243094430a1994c7efe5d12fe7c99a226"} Feb 17 13:57:45 crc kubenswrapper[4804]: I0217 13:57:45.725940 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" podStartSLOduration=2.541051077 podStartE2EDuration="2.725915367s" podCreationTimestamp="2026-02-17 13:57:43 +0000 UTC" firstStartedPulling="2026-02-17 13:57:44.695766689 +0000 UTC 
m=+1938.807186026" lastFinishedPulling="2026-02-17 13:57:44.880630979 +0000 UTC m=+1938.992050316" observedRunningTime="2026-02-17 13:57:45.71950206 +0000 UTC m=+1939.830921407" watchObservedRunningTime="2026-02-17 13:57:45.725915367 +0000 UTC m=+1939.837334714" Feb 17 13:57:52 crc kubenswrapper[4804]: I0217 13:57:52.574755 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:57:52 crc kubenswrapper[4804]: E0217 13:57:52.575682 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.043545 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.054848 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.065049 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.072617 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.587145 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1517f905-d980-43be-8583-f1a40170752e" path="/var/lib/kubelet/pods/1517f905-d980-43be-8583-f1a40170752e/volumes" Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.589028 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" path="/var/lib/kubelet/pods/5fa81aac-8f7a-4947-9fbe-c38851b3652e/volumes" Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.060883 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.070528 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.078894 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.087641 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.097655 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.129551 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.144503 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.156133 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.583773 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" path="/var/lib/kubelet/pods/3d23eb85-73ab-4049-b6be-486640c922e0/volumes" Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.584657 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" 
path="/var/lib/kubelet/pods/92d9081e-1e94-4244-b66a-34b05bc98f2d/volumes" Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.585177 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" path="/var/lib/kubelet/pods/ccb316de-cd6e-4f79-9387-81f7a8add771/volumes" Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.586030 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" path="/var/lib/kubelet/pods/f3c65a30-a890-4d85-80ca-93f9420d5aa4/volumes" Feb 17 13:58:03 crc kubenswrapper[4804]: I0217 13:58:03.574835 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:58:03 crc kubenswrapper[4804]: E0217 13:58:03.575299 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:58:16 crc kubenswrapper[4804]: I0217 13:58:16.583874 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:58:16 crc kubenswrapper[4804]: E0217 13:58:16.584676 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.038210 4804 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.048623 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.575334 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.600249 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" path="/var/lib/kubelet/pods/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53/volumes" Feb 17 13:58:29 crc kubenswrapper[4804]: I0217 13:58:29.094783 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f"} Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.105724 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerID="61d3aa7b960b08aa92c348033cb6b61247c79bcc289b0a714c865d0e129fa428" exitCode=0 Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.105833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerDied","Data":"61d3aa7b960b08aa92c348033cb6b61247c79bcc289b0a714c865d0e129fa428"} Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.543551 4804 scope.go:117] "RemoveContainer" containerID="75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.566846 4804 scope.go:117] "RemoveContainer" containerID="62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3" Feb 17 
13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.638988 4804 scope.go:117] "RemoveContainer" containerID="04848e079d7c3dd5aec9613ff12ec81fb185688c9c0af0d2f63039d17f192069" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.680453 4804 scope.go:117] "RemoveContainer" containerID="76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.736402 4804 scope.go:117] "RemoveContainer" containerID="feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.773556 4804 scope.go:117] "RemoveContainer" containerID="fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.835739 4804 scope.go:117] "RemoveContainer" containerID="cd29054fcbff23437aedab7f24e705fc390169a8546254413b976c34b8bd4901" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.451146 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.563455 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"5ca70007-e938-4bd5-9f2a-66f18b87743a\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.563817 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"5ca70007-e938-4bd5-9f2a-66f18b87743a\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.564018 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"5ca70007-e938-4bd5-9f2a-66f18b87743a\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.570621 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs" (OuterVolumeSpecName: "kube-api-access-pfqfs") pod "5ca70007-e938-4bd5-9f2a-66f18b87743a" (UID: "5ca70007-e938-4bd5-9f2a-66f18b87743a"). InnerVolumeSpecName "kube-api-access-pfqfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.591921 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5ca70007-e938-4bd5-9f2a-66f18b87743a" (UID: "5ca70007-e938-4bd5-9f2a-66f18b87743a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.604479 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory" (OuterVolumeSpecName: "inventory") pod "5ca70007-e938-4bd5-9f2a-66f18b87743a" (UID: "5ca70007-e938-4bd5-9f2a-66f18b87743a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.667460 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.667747 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.667763 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.126544 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" 
event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerDied","Data":"3c10d42299865df6258c432ca9ac58a243094430a1994c7efe5d12fe7c99a226"} Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.126591 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c10d42299865df6258c432ca9ac58a243094430a1994c7efe5d12fe7c99a226" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.126621 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.207923 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jrnh"] Feb 17 13:58:32 crc kubenswrapper[4804]: E0217 13:58:32.208427 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.208457 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.208740 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.209500 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.211963 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.211987 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.212163 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.212215 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.222142 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jrnh"] Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.279835 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.279898 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.280119 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.380813 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.380917 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.380969 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.392423 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.392804 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.415464 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.537185 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:33 crc kubenswrapper[4804]: I0217 13:58:33.070990 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jrnh"] Feb 17 13:58:33 crc kubenswrapper[4804]: W0217 13:58:33.076495 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb9b3eb_f3d1_4a32_8a87_b0f686cad260.slice/crio-da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492 WatchSource:0}: Error finding container da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492: Status 404 returned error can't find the container with id da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492 Feb 17 13:58:33 crc kubenswrapper[4804]: I0217 13:58:33.137719 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerStarted","Data":"da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492"} 
Feb 17 13:58:34 crc kubenswrapper[4804]: I0217 13:58:34.147898 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerStarted","Data":"8b9dc76207e8437c272b7a6756665cd4e57acdc64d44d4d72aea29e92acdf28b"} Feb 17 13:58:34 crc kubenswrapper[4804]: I0217 13:58:34.172107 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" podStartSLOduration=1.988925283 podStartE2EDuration="2.172086983s" podCreationTimestamp="2026-02-17 13:58:32 +0000 UTC" firstStartedPulling="2026-02-17 13:58:33.079829537 +0000 UTC m=+1987.191248874" lastFinishedPulling="2026-02-17 13:58:33.262991197 +0000 UTC m=+1987.374410574" observedRunningTime="2026-02-17 13:58:34.165192787 +0000 UTC m=+1988.276612124" watchObservedRunningTime="2026-02-17 13:58:34.172086983 +0000 UTC m=+1988.283506310" Feb 17 13:58:40 crc kubenswrapper[4804]: I0217 13:58:40.199131 4804 generic.go:334] "Generic (PLEG): container finished" podID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerID="8b9dc76207e8437c272b7a6756665cd4e57acdc64d44d4d72aea29e92acdf28b" exitCode=0 Feb 17 13:58:40 crc kubenswrapper[4804]: I0217 13:58:40.199227 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerDied","Data":"8b9dc76207e8437c272b7a6756665cd4e57acdc64d44d4d72aea29e92acdf28b"} Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.652697 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.763504 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.763572 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.763644 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.770963 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv" (OuterVolumeSpecName: "kube-api-access-n6fnv") pod "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" (UID: "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260"). InnerVolumeSpecName "kube-api-access-n6fnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.792976 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" (UID: "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.806760 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" (UID: "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.867853 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.867889 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.867902 4804 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.225018 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerDied","Data":"da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492"} Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.225178 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.225257 
4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.292857 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c"] Feb 17 13:58:42 crc kubenswrapper[4804]: E0217 13:58:42.293393 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerName="ssh-known-hosts-edpm-deployment" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.293416 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerName="ssh-known-hosts-edpm-deployment" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.293693 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerName="ssh-known-hosts-edpm-deployment" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.294479 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.298051 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.298427 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.302628 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.302827 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.309852 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c"] Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.547943 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.548144 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.548245 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.650047 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.650164 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.650259 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.654067 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: 
\"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.663380 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.668506 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.752158 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:43 crc kubenswrapper[4804]: I0217 13:58:43.275170 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c"] Feb 17 13:58:43 crc kubenswrapper[4804]: W0217 13:58:43.277446 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fe0e44_6604_4e17_bcb4_05f202508fc7.slice/crio-b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb WatchSource:0}: Error finding container b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb: Status 404 returned error can't find the container with id b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb Feb 17 13:58:44 crc kubenswrapper[4804]: I0217 13:58:44.245461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerStarted","Data":"22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc"} Feb 17 13:58:44 crc kubenswrapper[4804]: I0217 13:58:44.245759 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerStarted","Data":"b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb"} Feb 17 13:58:44 crc kubenswrapper[4804]: I0217 13:58:44.273240 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" podStartSLOduration=2.021755222 podStartE2EDuration="2.273195945s" podCreationTimestamp="2026-02-17 13:58:42 +0000 UTC" firstStartedPulling="2026-02-17 13:58:43.279125861 +0000 UTC m=+1997.390545198" lastFinishedPulling="2026-02-17 13:58:43.530566584 +0000 UTC m=+1997.641985921" observedRunningTime="2026-02-17 
13:58:44.262781359 +0000 UTC m=+1998.374200716" watchObservedRunningTime="2026-02-17 13:58:44.273195945 +0000 UTC m=+1998.384615282" Feb 17 13:58:47 crc kubenswrapper[4804]: I0217 13:58:47.041945 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:58:47 crc kubenswrapper[4804]: I0217 13:58:47.051828 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:58:48 crc kubenswrapper[4804]: I0217 13:58:48.587591 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" path="/var/lib/kubelet/pods/6597adc7-fdae-4de0-99bc-87d9807f38f4/volumes" Feb 17 13:58:51 crc kubenswrapper[4804]: E0217 13:58:51.768703 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fe0e44_6604_4e17_bcb4_05f202508fc7.slice/crio-conmon-22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fe0e44_6604_4e17_bcb4_05f202508fc7.slice/crio-22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc.scope\": RecentStats: unable to find data in memory cache]" Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.034459 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.044061 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.331429 4804 generic.go:334] "Generic (PLEG): container finished" podID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerID="22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc" exitCode=0 Feb 17 13:58:52 crc 
kubenswrapper[4804]: I0217 13:58:52.331483 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerDied","Data":"22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc"} Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.586586 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11e165e-2605-470a-a865-230b274ce8d3" path="/var/lib/kubelet/pods/c11e165e-2605-470a-a865-230b274ce8d3/volumes" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.715610 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.769684 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"01fe0e44-6604-4e17-bcb4-05f202508fc7\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.769851 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"01fe0e44-6604-4e17-bcb4-05f202508fc7\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.769888 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"01fe0e44-6604-4e17-bcb4-05f202508fc7\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.776973 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28" (OuterVolumeSpecName: "kube-api-access-pdt28") pod "01fe0e44-6604-4e17-bcb4-05f202508fc7" (UID: "01fe0e44-6604-4e17-bcb4-05f202508fc7"). InnerVolumeSpecName "kube-api-access-pdt28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.802165 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory" (OuterVolumeSpecName: "inventory") pod "01fe0e44-6604-4e17-bcb4-05f202508fc7" (UID: "01fe0e44-6604-4e17-bcb4-05f202508fc7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.802788 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "01fe0e44-6604-4e17-bcb4-05f202508fc7" (UID: "01fe0e44-6604-4e17-bcb4-05f202508fc7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.871155 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.871209 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.871221 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.350981 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerDied","Data":"b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb"} Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.351032 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.351037 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.438724 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66"] Feb 17 13:58:54 crc kubenswrapper[4804]: E0217 13:58:54.439238 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.439256 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.439480 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.440067 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.446786 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66"] Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448115 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448493 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448701 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448839 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.493451 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.493523 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.493764 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.595237 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.595294 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.595760 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.600225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.602740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.616708 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.802565 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:55 crc kubenswrapper[4804]: I0217 13:58:55.307605 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66"] Feb 17 13:58:55 crc kubenswrapper[4804]: I0217 13:58:55.371474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerStarted","Data":"5b78661fdc285bf6f05049d4a9d9f5cf1f82874131daffa67decbbaa3d1036e7"} Feb 17 13:58:56 crc kubenswrapper[4804]: I0217 13:58:56.384031 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerStarted","Data":"36eea98e6310a84677647c1fb3714e8c2a397adf495a6da21f1cabe7f0c0a0b7"} Feb 17 13:58:56 crc kubenswrapper[4804]: I0217 13:58:56.404856 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" podStartSLOduration=2.217252199 podStartE2EDuration="2.404837978s" podCreationTimestamp="2026-02-17 13:58:54 +0000 UTC" firstStartedPulling="2026-02-17 13:58:55.316270018 +0000 UTC m=+2009.427689355" lastFinishedPulling="2026-02-17 13:58:55.503855797 +0000 UTC m=+2009.615275134" observedRunningTime="2026-02-17 13:58:56.403175945 +0000 UTC m=+2010.514595292" watchObservedRunningTime="2026-02-17 13:58:56.404837978 +0000 UTC m=+2010.516257315" Feb 17 13:59:04 crc kubenswrapper[4804]: I0217 13:59:04.456841 4804 generic.go:334] "Generic (PLEG): container finished" podID="100d84c5-396c-4772-af09-2e223e72a640" containerID="36eea98e6310a84677647c1fb3714e8c2a397adf495a6da21f1cabe7f0c0a0b7" exitCode=0 Feb 17 13:59:04 crc kubenswrapper[4804]: I0217 13:59:04.456948 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerDied","Data":"36eea98e6310a84677647c1fb3714e8c2a397adf495a6da21f1cabe7f0c0a0b7"} Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.844819 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.925075 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"100d84c5-396c-4772-af09-2e223e72a640\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.925154 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"100d84c5-396c-4772-af09-2e223e72a640\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.925255 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"100d84c5-396c-4772-af09-2e223e72a640\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.931434 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7" (OuterVolumeSpecName: "kube-api-access-9s4b7") pod "100d84c5-396c-4772-af09-2e223e72a640" (UID: "100d84c5-396c-4772-af09-2e223e72a640"). InnerVolumeSpecName "kube-api-access-9s4b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.956335 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "100d84c5-396c-4772-af09-2e223e72a640" (UID: "100d84c5-396c-4772-af09-2e223e72a640"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.957288 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory" (OuterVolumeSpecName: "inventory") pod "100d84c5-396c-4772-af09-2e223e72a640" (UID: "100d84c5-396c-4772-af09-2e223e72a640"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.028317 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.028397 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.028414 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.476174 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" 
event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerDied","Data":"5b78661fdc285bf6f05049d4a9d9f5cf1f82874131daffa67decbbaa3d1036e7"} Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.476228 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b78661fdc285bf6f05049d4a9d9f5cf1f82874131daffa67decbbaa3d1036e7" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.476264 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.588041 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8"] Feb 17 13:59:06 crc kubenswrapper[4804]: E0217 13:59:06.588715 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100d84c5-396c-4772-af09-2e223e72a640" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.588732 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="100d84c5-396c-4772-af09-2e223e72a640" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.588935 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="100d84c5-396c-4772-af09-2e223e72a640" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.590435 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8"] Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.590541 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.593496 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.593813 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.594018 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.594280 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.594828 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.595174 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.595371 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.595872 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741066 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: 
\"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741178 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741244 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741284 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: 
\"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741322 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741435 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741455 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741502 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741525 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741559 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741584 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741608 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843435 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843586 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843620 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843643 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843693 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843725 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843763 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843788 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843857 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.848175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.848838 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.849211 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.850475 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.850700 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.850938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.851010 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.851093 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.852052 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.852674 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.852710 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.854163 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.855323 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.863461 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.907431 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:07 crc kubenswrapper[4804]: I0217 13:59:07.417632 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8"] Feb 17 13:59:07 crc kubenswrapper[4804]: I0217 13:59:07.484506 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerStarted","Data":"4e7c98968a7dfaeb3b3af000332cc3d28899bc087d1522fecd51ab062f8851da"} Feb 17 13:59:08 crc kubenswrapper[4804]: I0217 13:59:08.492247 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerStarted","Data":"b6be46f9b30dbef9a223cc19fe2b815a0349e906da67e5e489219a565cceb442"} Feb 17 13:59:08 crc kubenswrapper[4804]: I0217 13:59:08.515480 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" podStartSLOduration=2.346224595 podStartE2EDuration="2.515459958s" podCreationTimestamp="2026-02-17 13:59:06 +0000 UTC" firstStartedPulling="2026-02-17 13:59:07.419788325 +0000 UTC m=+2021.531207662" lastFinishedPulling="2026-02-17 13:59:07.589023688 +0000 UTC m=+2021.700443025" observedRunningTime="2026-02-17 13:59:08.508301883 +0000 UTC m=+2022.619721220" watchObservedRunningTime="2026-02-17 13:59:08.515459958 +0000 UTC m=+2022.626879285" Feb 17 13:59:30 crc kubenswrapper[4804]: I0217 13:59:30.045272 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:59:30 crc kubenswrapper[4804]: I0217 13:59:30.053047 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:59:30 crc kubenswrapper[4804]: I0217 13:59:30.587108 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" path="/var/lib/kubelet/pods/0b6d06cb-8252-4c27-815b-1f09a217cbb4/volumes" Feb 17 13:59:31 crc kubenswrapper[4804]: I0217 13:59:31.029852 4804 scope.go:117] "RemoveContainer" containerID="24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137" Feb 17 13:59:31 crc kubenswrapper[4804]: I0217 13:59:31.067038 4804 scope.go:117] "RemoveContainer" containerID="29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6" Feb 17 13:59:31 crc kubenswrapper[4804]: I0217 13:59:31.123398 4804 scope.go:117] "RemoveContainer" containerID="3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529" Feb 17 13:59:43 crc kubenswrapper[4804]: I0217 13:59:43.811376 4804 generic.go:334] "Generic (PLEG): container finished" podID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerID="b6be46f9b30dbef9a223cc19fe2b815a0349e906da67e5e489219a565cceb442" exitCode=0 Feb 17 13:59:43 crc kubenswrapper[4804]: I0217 13:59:43.811493 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerDied","Data":"b6be46f9b30dbef9a223cc19fe2b815a0349e906da67e5e489219a565cceb442"} Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.338279 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469327 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469420 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469445 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469468 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469489 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469512 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469536 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469625 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469781 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469901 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: 
\"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470125 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470487 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.476681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.481661 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.481804 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483108 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483145 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483156 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483179 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483188 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483262 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483359 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483418 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s" (OuterVolumeSpecName: "kube-api-access-fs62s") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "kube-api-access-fs62s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.486793 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.505064 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory" (OuterVolumeSpecName: "inventory") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.514936 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.572931 4804 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.572995 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573010 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573027 4804 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573041 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573054 4804 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573070 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573083 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573097 4804 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573111 4804 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573125 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 
13:59:46.573136 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573147 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573157 4804 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.845715 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerDied","Data":"4e7c98968a7dfaeb3b3af000332cc3d28899bc087d1522fecd51ab062f8851da"} Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.845767 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e7c98968a7dfaeb3b3af000332cc3d28899bc087d1522fecd51ab062f8851da" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.845775 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.448144 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m"] Feb 17 13:59:47 crc kubenswrapper[4804]: E0217 13:59:47.448541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.448554 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.448738 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.449314 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451453 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451497 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451777 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451889 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.458929 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.466410 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m"] Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.560433 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561045 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561188 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561636 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663480 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663588 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663701 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663743 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663775 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.664846 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.669621 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.669762 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.670142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.679489 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.766531 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:48 crc kubenswrapper[4804]: I0217 13:59:48.375519 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m"] Feb 17 13:59:48 crc kubenswrapper[4804]: I0217 13:59:48.973876 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerStarted","Data":"8e3443e0a50b60470ef93ad0d7e6c63fd03c0873cfe2fa3786abfaf905be2422"} Feb 17 13:59:49 crc kubenswrapper[4804]: I0217 13:59:49.985242 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerStarted","Data":"12c7596730ce7431db3737b621eacbf5768ff35bc98e48dfcb2ddd7465e4e588"} Feb 17 13:59:50 crc kubenswrapper[4804]: I0217 13:59:50.090023 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" podStartSLOduration=2.898249229 podStartE2EDuration="3.089999888s" podCreationTimestamp="2026-02-17 13:59:47 +0000 UTC" firstStartedPulling="2026-02-17 13:59:48.376602804 +0000 UTC m=+2062.488022151" lastFinishedPulling="2026-02-17 13:59:48.568353473 +0000 UTC m=+2062.679772810" observedRunningTime="2026-02-17 13:59:50.080362835 +0000 UTC m=+2064.191782172" watchObservedRunningTime="2026-02-17 13:59:50.089999888 +0000 UTC m=+2064.201419235" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.142094 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc"] Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.144649 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.150608 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.150817 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.156349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc"] Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.280384 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.280553 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.280807 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.383125 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.383413 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.383598 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.385740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.390188 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.410398 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.479750 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.938861 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc"] Feb 17 14:00:01 crc kubenswrapper[4804]: I0217 14:00:01.081569 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" event={"ID":"e5860044-8a05-47fc-848e-fe988543fbe6","Type":"ContainerStarted","Data":"3085cbf7ec604fa269706abe77a1c4626eb24d5b8d18de0183c58e390762011f"} Feb 17 14:00:02 crc kubenswrapper[4804]: I0217 14:00:02.098281 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5860044-8a05-47fc-848e-fe988543fbe6" containerID="65a25413b82f8ebba6f197969e84840573a778c78b72f9b376f4a3d6b1b0329b" exitCode=0 Feb 17 14:00:02 crc kubenswrapper[4804]: I0217 14:00:02.098378 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" 
event={"ID":"e5860044-8a05-47fc-848e-fe988543fbe6","Type":"ContainerDied","Data":"65a25413b82f8ebba6f197969e84840573a778c78b72f9b376f4a3d6b1b0329b"} Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.465914 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.552984 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"e5860044-8a05-47fc-848e-fe988543fbe6\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.553090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"e5860044-8a05-47fc-848e-fe988543fbe6\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.553252 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"e5860044-8a05-47fc-848e-fe988543fbe6\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.554231 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5860044-8a05-47fc-848e-fe988543fbe6" (UID: "e5860044-8a05-47fc-848e-fe988543fbe6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.558862 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv" (OuterVolumeSpecName: "kube-api-access-xmnwv") pod "e5860044-8a05-47fc-848e-fe988543fbe6" (UID: "e5860044-8a05-47fc-848e-fe988543fbe6"). InnerVolumeSpecName "kube-api-access-xmnwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.563983 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5860044-8a05-47fc-848e-fe988543fbe6" (UID: "e5860044-8a05-47fc-848e-fe988543fbe6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.655979 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.656017 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.656026 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.115420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" 
event={"ID":"e5860044-8a05-47fc-848e-fe988543fbe6","Type":"ContainerDied","Data":"3085cbf7ec604fa269706abe77a1c4626eb24d5b8d18de0183c58e390762011f"} Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.115453 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.115463 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3085cbf7ec604fa269706abe77a1c4626eb24d5b8d18de0183c58e390762011f" Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.545476 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"] Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.556428 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"] Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.584825 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" path="/var/lib/kubelet/pods/3768c453-c58d-4768-9620-a202cbb8ccd8/volumes" Feb 17 14:00:31 crc kubenswrapper[4804]: I0217 14:00:31.240288 4804 scope.go:117] "RemoveContainer" containerID="4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227" Feb 17 14:00:44 crc kubenswrapper[4804]: I0217 14:00:44.474994 4804 generic.go:334] "Generic (PLEG): container finished" podID="be98213b-0510-4f69-9d98-81363c04d8bd" containerID="12c7596730ce7431db3737b621eacbf5768ff35bc98e48dfcb2ddd7465e4e588" exitCode=0 Feb 17 14:00:44 crc kubenswrapper[4804]: I0217 14:00:44.475075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerDied","Data":"12c7596730ce7431db3737b621eacbf5768ff35bc98e48dfcb2ddd7465e4e588"} Feb 17 
14:00:45 crc kubenswrapper[4804]: I0217 14:00:45.891860 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094043 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094429 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094629 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094862 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.095249 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: 
\"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.100918 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.104450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv" (OuterVolumeSpecName: "kube-api-access-tz2qv") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "kube-api-access-tz2qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.127395 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.136799 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory" (OuterVolumeSpecName: "inventory") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.157088 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198348 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198404 4804 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198420 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198432 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198444 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.499644 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerDied","Data":"8e3443e0a50b60470ef93ad0d7e6c63fd03c0873cfe2fa3786abfaf905be2422"} Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.499893 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e3443e0a50b60470ef93ad0d7e6c63fd03c0873cfe2fa3786abfaf905be2422" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.500985 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.624452 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"] Feb 17 14:00:46 crc kubenswrapper[4804]: E0217 14:00:46.624905 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5860044-8a05-47fc-848e-fe988543fbe6" containerName="collect-profiles" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.624926 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5860044-8a05-47fc-848e-fe988543fbe6" containerName="collect-profiles" Feb 17 14:00:46 crc kubenswrapper[4804]: E0217 14:00:46.624952 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be98213b-0510-4f69-9d98-81363c04d8bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.624959 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="be98213b-0510-4f69-9d98-81363c04d8bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.625151 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5860044-8a05-47fc-848e-fe988543fbe6" containerName="collect-profiles" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.625182 4804 
memory_manager.go:354] "RemoveStaleState removing state" podUID="be98213b-0510-4f69-9d98-81363c04d8bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.625874 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.636959 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.636983 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.637205 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.637279 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.639710 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.639920 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.673997 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"] Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709181 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709243 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709270 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709302 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709337 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709355 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811172 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811242 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811278 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: 
\"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811322 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811368 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811395 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.822986 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc 
kubenswrapper[4804]: I0217 14:00:46.824897 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.827745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.839896 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.855050 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.860068 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.953693 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:47 crc kubenswrapper[4804]: W0217 14:00:47.557729 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84938cd5_694c_423a_a0d1_801f28085377.slice/crio-fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d WatchSource:0}: Error finding container fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d: Status 404 returned error can't find the container with id fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d Feb 17 14:00:47 crc kubenswrapper[4804]: I0217 14:00:47.565400 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"] Feb 17 14:00:48 crc kubenswrapper[4804]: I0217 14:00:48.517287 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerStarted","Data":"dca45023c7c97a7a87b586cb70d296ad9987cc9764180a5e3b59f9fa0e2be83c"} Feb 17 14:00:48 crc kubenswrapper[4804]: I0217 14:00:48.517631 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerStarted","Data":"fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d"} 
Feb 17 14:00:48 crc kubenswrapper[4804]: I0217 14:00:48.543412 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" podStartSLOduration=2.332133852 podStartE2EDuration="2.543387757s" podCreationTimestamp="2026-02-17 14:00:46 +0000 UTC" firstStartedPulling="2026-02-17 14:00:47.561411365 +0000 UTC m=+2121.672830702" lastFinishedPulling="2026-02-17 14:00:47.77266527 +0000 UTC m=+2121.884084607" observedRunningTime="2026-02-17 14:00:48.537168302 +0000 UTC m=+2122.648587649" watchObservedRunningTime="2026-02-17 14:00:48.543387757 +0000 UTC m=+2122.654807094" Feb 17 14:00:55 crc kubenswrapper[4804]: I0217 14:00:55.835108 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:00:55 crc kubenswrapper[4804]: I0217 14:00:55.835571 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.140518 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29522281-k9ptv"] Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.143474 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.175505 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522281-k9ptv"] Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281785 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281822 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281932 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281970 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383634 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383754 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383848 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383882 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.390040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.390748 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.391435 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.402360 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.484577 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.003354 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522281-k9ptv"] Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.644108 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerStarted","Data":"deed3aaa8676f3c0d8f2143f71bdec1c0dc234dca7c6bcf52a241acbff2f9e66"} Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.644418 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerStarted","Data":"b10580fc3079f4e1ccce270ed3c03f975f755bd6366f63cb48dee4c22f68f194"} Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.667820 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29522281-k9ptv" podStartSLOduration=1.667800275 podStartE2EDuration="1.667800275s" podCreationTimestamp="2026-02-17 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:01:01.65870319 +0000 UTC m=+2135.770122537" watchObservedRunningTime="2026-02-17 14:01:01.667800275 +0000 UTC m=+2135.779219612" Feb 17 14:01:03 crc kubenswrapper[4804]: I0217 14:01:03.665825 4804 generic.go:334] "Generic (PLEG): container finished" podID="c2d1f319-5d08-4969-a968-45eba20958a7" containerID="deed3aaa8676f3c0d8f2143f71bdec1c0dc234dca7c6bcf52a241acbff2f9e66" exitCode=0 Feb 17 14:01:03 crc kubenswrapper[4804]: I0217 14:01:03.665916 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" 
event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerDied","Data":"deed3aaa8676f3c0d8f2143f71bdec1c0dc234dca7c6bcf52a241acbff2f9e66"} Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.025484 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.176536 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.176791 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.176976 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.177058 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.181959 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj" 
(OuterVolumeSpecName: "kube-api-access-4qbqj") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "kube-api-access-4qbqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.182395 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.205570 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.236392 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data" (OuterVolumeSpecName: "config-data") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279676 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279714 4804 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279723 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279734 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.682991 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerDied","Data":"b10580fc3079f4e1ccce270ed3c03f975f755bd6366f63cb48dee4c22f68f194"}
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.683425 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b10580fc3079f4e1ccce270ed3c03f975f755bd6366f63cb48dee4c22f68f194"
Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.683487 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv"
Feb 17 14:01:25 crc kubenswrapper[4804]: I0217 14:01:25.835275 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:01:25 crc kubenswrapper[4804]: I0217 14:01:25.835883 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:01:31 crc kubenswrapper[4804]: I0217 14:01:31.961143 4804 generic.go:334] "Generic (PLEG): container finished" podID="84938cd5-694c-423a-a0d1-801f28085377" containerID="dca45023c7c97a7a87b586cb70d296ad9987cc9764180a5e3b59f9fa0e2be83c" exitCode=0
Feb 17 14:01:31 crc kubenswrapper[4804]: I0217 14:01:31.961274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerDied","Data":"dca45023c7c97a7a87b586cb70d296ad9987cc9764180a5e3b59f9fa0e2be83c"}
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.428292 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479168 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") "
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") "
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479287 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") "
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479350 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") "
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479373 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") "
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479530 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") "
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.487223 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf" (OuterVolumeSpecName: "kube-api-access-8fbwf") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "kube-api-access-8fbwf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.490678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.507732 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.511659 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory" (OuterVolumeSpecName: "inventory") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.521189 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.525781 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582055 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582113 4804 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582130 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582146 4804 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582160 4804 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582174 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") on node \"crc\" DevicePath \"\""
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.982181 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerDied","Data":"fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d"}
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.982236 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"
Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.982243 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.081355 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"]
Feb 17 14:01:34 crc kubenswrapper[4804]: E0217 14:01:34.081876 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d1f319-5d08-4969-a968-45eba20958a7" containerName="keystone-cron"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.081898 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d1f319-5d08-4969-a968-45eba20958a7" containerName="keystone-cron"
Feb 17 14:01:34 crc kubenswrapper[4804]: E0217 14:01:34.081933 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84938cd5-694c-423a-a0d1-801f28085377" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.081946 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="84938cd5-694c-423a-a0d1-801f28085377" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.082188 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="84938cd5-694c-423a-a0d1-801f28085377" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.082228 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d1f319-5d08-4969-a968-45eba20958a7" containerName="keystone-cron"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.083064 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.085327 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.085940 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.178886 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.179256 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.179721 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.191159 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"]
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285345 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285452 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285704 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.286065 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.387923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388308 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388345 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388429 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.392408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.393479 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.394327 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.410396 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.414912 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.511240 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:01:35 crc kubenswrapper[4804]: I0217 14:01:35.006471 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"]
Feb 17 14:01:35 crc kubenswrapper[4804]: I0217 14:01:35.011066 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 14:01:36 crc kubenswrapper[4804]: I0217 14:01:36.001464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerStarted","Data":"8ce565034da923c62aa35b8a82d937d994fde79d28a308a124ad2648ce45eeca"}
Feb 17 14:01:36 crc kubenswrapper[4804]: I0217 14:01:36.003537 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerStarted","Data":"b85edd5f6c172e3fc7186590e69abc45c66e58877217d9c682e3a1e6773d16ec"}
Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.852558 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" podStartSLOduration=16.679926228 podStartE2EDuration="16.852535604s" podCreationTimestamp="2026-02-17 14:01:34 +0000 UTC" firstStartedPulling="2026-02-17 14:01:35.01087148 +0000 UTC m=+2169.122290817" lastFinishedPulling="2026-02-17 14:01:35.183480856 +0000 UTC m=+2169.294900193" observedRunningTime="2026-02-17 14:01:36.023976287 +0000 UTC m=+2170.135395634" watchObservedRunningTime="2026-02-17 14:01:50.852535604 +0000 UTC m=+2184.963954941"
Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.859081 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"]
Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.861288 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.874010 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"]
Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.925234 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.925614 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.925832 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.027774 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.028036 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.028652 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.028817 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.029267 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.045820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.190390 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.836328 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"]
Feb 17 14:01:52 crc kubenswrapper[4804]: I0217 14:01:52.137707 4804 generic.go:334] "Generic (PLEG): container finished" podID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483" exitCode=0
Feb 17 14:01:52 crc kubenswrapper[4804]: I0217 14:01:52.137810 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483"}
Feb 17 14:01:52 crc kubenswrapper[4804]: I0217 14:01:52.138113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerStarted","Data":"83d3e06f4d72a2eb7be71704c6207c2d36fbeeedb06f9d779450bce36a4899aa"}
Feb 17 14:01:53 crc kubenswrapper[4804]: I0217 14:01:53.148690 4804 generic.go:334] "Generic (PLEG): container finished" podID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a" exitCode=0
Feb 17 14:01:53 crc kubenswrapper[4804]: I0217 14:01:53.148764 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a"}
Feb 17 14:01:54 crc kubenswrapper[4804]: I0217 14:01:54.158694 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerStarted","Data":"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"}
Feb 17 14:01:54 crc kubenswrapper[4804]: I0217 14:01:54.176829 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-24vs5" podStartSLOduration=2.550049535 podStartE2EDuration="4.176793519s" podCreationTimestamp="2026-02-17 14:01:50 +0000 UTC" firstStartedPulling="2026-02-17 14:01:52.139408885 +0000 UTC m=+2186.250828222" lastFinishedPulling="2026-02-17 14:01:53.766152869 +0000 UTC m=+2187.877572206" observedRunningTime="2026-02-17 14:01:54.175809817 +0000 UTC m=+2188.287229154" watchObservedRunningTime="2026-02-17 14:01:54.176793519 +0000 UTC m=+2188.288212856"
Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.835007 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.835279 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.835321 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.836038 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.836088 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f" gracePeriod=600
Feb 17 14:01:56 crc kubenswrapper[4804]: I0217 14:01:56.184002 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f" exitCode=0
Feb 17 14:01:56 crc kubenswrapper[4804]: I0217 14:01:56.184335 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f"}
Feb 17 14:01:56 crc kubenswrapper[4804]: I0217 14:01:56.184371 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 14:01:57 crc kubenswrapper[4804]: I0217 14:01:57.194854 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"}
Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.191378 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.191819 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.244576 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.298032 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.502794 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"]
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.248969 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-24vs5" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" containerID="cri-o://74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" gracePeriod=2
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.755629 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.900265 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") "
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.900403 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") "
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.900468 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") "
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.901551 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities" (OuterVolumeSpecName: "utilities") pod "5a890ce7-1c49-42c1-8158-a8bb6df28bca" (UID: "5a890ce7-1c49-42c1-8158-a8bb6df28bca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.905840 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq" (OuterVolumeSpecName: "kube-api-access-vqxxq") pod "5a890ce7-1c49-42c1-8158-a8bb6df28bca" (UID: "5a890ce7-1c49-42c1-8158-a8bb6df28bca"). InnerVolumeSpecName "kube-api-access-vqxxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.928794 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a890ce7-1c49-42c1-8158-a8bb6df28bca" (UID: "5a890ce7-1c49-42c1-8158-a8bb6df28bca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.002347 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") on node \"crc\" DevicePath \"\""
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.002383 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.002393 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260225 4804 generic.go:334] "Generic (PLEG): container finished" podID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" exitCode=0
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260278 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5"
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"}
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260770 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"83d3e06f4d72a2eb7be71704c6207c2d36fbeeedb06f9d779450bce36a4899aa"}
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260812 4804 scope.go:117] "RemoveContainer" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.297432 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"]
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.309439 4804 scope.go:117] "RemoveContainer" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a"
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.310794 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"]
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.327693 4804 scope.go:117] "RemoveContainer" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483"
Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.370626 4804 scope.go:117] "RemoveContainer" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"
Feb 17 14:02:04 crc kubenswrapper[4804]: E0217 14:02:04.371130 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3\": container with ID starting with 74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3 not found: ID does not exist" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371168 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"} err="failed to get container status \"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3\": rpc error: code = NotFound desc = could not find container \"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3\": container with ID starting with 74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3 not found: ID does not exist" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371238 4804 scope.go:117] "RemoveContainer" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a" Feb 17 14:02:04 crc kubenswrapper[4804]: E0217 14:02:04.371672 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a\": container with ID starting with 974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a not found: ID does not exist" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371790 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a"} err="failed to get container status \"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a\": rpc error: code = NotFound desc = could not find container \"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a\": container with ID 
starting with 974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a not found: ID does not exist" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371879 4804 scope.go:117] "RemoveContainer" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483" Feb 17 14:02:04 crc kubenswrapper[4804]: E0217 14:02:04.372252 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483\": container with ID starting with c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483 not found: ID does not exist" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.372348 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483"} err="failed to get container status \"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483\": rpc error: code = NotFound desc = could not find container \"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483\": container with ID starting with c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483 not found: ID does not exist" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.584762 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" path="/var/lib/kubelet/pods/5a890ce7-1c49-42c1-8158-a8bb6df28bca/volumes" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.508057 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8p6ls"] Feb 17 14:03:01 crc kubenswrapper[4804]: E0217 14:03:01.511120 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-content" Feb 17 14:03:01 crc 
kubenswrapper[4804]: I0217 14:03:01.511157 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-content" Feb 17 14:03:01 crc kubenswrapper[4804]: E0217 14:03:01.511252 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-utilities" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.511264 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-utilities" Feb 17 14:03:01 crc kubenswrapper[4804]: E0217 14:03:01.511281 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.511291 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.511659 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.513846 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.528125 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"] Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.612237 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.612341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.612398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.713815 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.714339 4804 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.714608 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.714944 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.715326 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.742265 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.835058 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:02 crc kubenswrapper[4804]: I0217 14:03:02.366350 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"] Feb 17 14:03:03 crc kubenswrapper[4804]: I0217 14:03:03.201545 4804 generic.go:334] "Generic (PLEG): container finished" podID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977" exitCode=0 Feb 17 14:03:03 crc kubenswrapper[4804]: I0217 14:03:03.201908 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977"} Feb 17 14:03:03 crc kubenswrapper[4804]: I0217 14:03:03.202134 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerStarted","Data":"675f3b5cd0c50bc790b7d5e0a3f30a92a499a3872cebb2edb676d0e9e0b963ee"} Feb 17 14:03:04 crc kubenswrapper[4804]: I0217 14:03:04.213470 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerStarted","Data":"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"} Feb 17 14:03:05 crc kubenswrapper[4804]: I0217 14:03:05.225038 4804 generic.go:334] "Generic (PLEG): container finished" podID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe" exitCode=0 Feb 17 14:03:05 crc kubenswrapper[4804]: I0217 14:03:05.225091 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" 
event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"} Feb 17 14:03:06 crc kubenswrapper[4804]: I0217 14:03:06.239398 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerStarted","Data":"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"} Feb 17 14:03:06 crc kubenswrapper[4804]: I0217 14:03:06.267575 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8p6ls" podStartSLOduration=2.861565206 podStartE2EDuration="5.267534833s" podCreationTimestamp="2026-02-17 14:03:01 +0000 UTC" firstStartedPulling="2026-02-17 14:03:03.205183912 +0000 UTC m=+2257.316603289" lastFinishedPulling="2026-02-17 14:03:05.611153569 +0000 UTC m=+2259.722572916" observedRunningTime="2026-02-17 14:03:06.258809161 +0000 UTC m=+2260.370228508" watchObservedRunningTime="2026-02-17 14:03:06.267534833 +0000 UTC m=+2260.378954160" Feb 17 14:03:11 crc kubenswrapper[4804]: I0217 14:03:11.835485 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:11 crc kubenswrapper[4804]: I0217 14:03:11.836081 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:11 crc kubenswrapper[4804]: I0217 14:03:11.888051 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:12 crc kubenswrapper[4804]: I0217 14:03:12.366605 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:12 crc kubenswrapper[4804]: I0217 14:03:12.423052 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8p6ls"] Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.314423 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8p6ls" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server" containerID="cri-o://b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f" gracePeriod=2 Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.783977 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.868157 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.868299 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.868626 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.869432 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities" (OuterVolumeSpecName: "utilities") pod "0901d547-00b8-45f5-b76c-d3a87cf88ee3" (UID: 
"0901d547-00b8-45f5-b76c-d3a87cf88ee3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.874607 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm" (OuterVolumeSpecName: "kube-api-access-tqmtm") pod "0901d547-00b8-45f5-b76c-d3a87cf88ee3" (UID: "0901d547-00b8-45f5-b76c-d3a87cf88ee3"). InnerVolumeSpecName "kube-api-access-tqmtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.930142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0901d547-00b8-45f5-b76c-d3a87cf88ee3" (UID: "0901d547-00b8-45f5-b76c-d3a87cf88ee3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.971520 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.971559 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.971571 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") on node \"crc\" DevicePath \"\"" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.328916 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f" exitCode=0 Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.328984 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"} Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.329024 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"675f3b5cd0c50bc790b7d5e0a3f30a92a499a3872cebb2edb676d0e9e0b963ee"} Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.329058 4804 scope.go:117] "RemoveContainer" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.329304 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.375407 4804 scope.go:117] "RemoveContainer" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.381862 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"] Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.390994 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"] Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.411452 4804 scope.go:117] "RemoveContainer" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.439079 4804 scope.go:117] "RemoveContainer" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f" Feb 17 14:03:15 crc kubenswrapper[4804]: E0217 14:03:15.439685 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f\": container with ID starting with b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f not found: ID does not exist" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.439733 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"} err="failed to get container status \"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f\": rpc error: code = NotFound desc = could not find container \"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f\": container with ID starting with b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f not 
found: ID does not exist" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.439764 4804 scope.go:117] "RemoveContainer" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe" Feb 17 14:03:15 crc kubenswrapper[4804]: E0217 14:03:15.440249 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe\": container with ID starting with eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe not found: ID does not exist" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.440362 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"} err="failed to get container status \"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe\": rpc error: code = NotFound desc = could not find container \"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe\": container with ID starting with eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe not found: ID does not exist" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.440458 4804 scope.go:117] "RemoveContainer" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977" Feb 17 14:03:15 crc kubenswrapper[4804]: E0217 14:03:15.440798 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977\": container with ID starting with cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977 not found: ID does not exist" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977" Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.440839 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977"} err="failed to get container status \"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977\": rpc error: code = NotFound desc = could not find container \"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977\": container with ID starting with cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977 not found: ID does not exist" Feb 17 14:03:16 crc kubenswrapper[4804]: I0217 14:03:16.591820 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" path="/var/lib/kubelet/pods/0901d547-00b8-45f5-b76c-d3a87cf88ee3/volumes" Feb 17 14:04:25 crc kubenswrapper[4804]: I0217 14:04:25.836058 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:04:25 crc kubenswrapper[4804]: I0217 14:04:25.836714 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:04:55 crc kubenswrapper[4804]: I0217 14:04:55.835476 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:04:55 crc kubenswrapper[4804]: I0217 14:04:55.837448 4804 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:05:06 crc kubenswrapper[4804]: I0217 14:05:06.446981 4804 generic.go:334] "Generic (PLEG): container finished" podID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" containerID="8ce565034da923c62aa35b8a82d937d994fde79d28a308a124ad2648ce45eeca" exitCode=0 Feb 17 14:05:06 crc kubenswrapper[4804]: I0217 14:05:06.447070 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerDied","Data":"8ce565034da923c62aa35b8a82d937d994fde79d28a308a124ad2648ce45eeca"} Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.855335 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990302 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990373 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990405 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990475 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.002783 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.005245 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk" (OuterVolumeSpecName: "kube-api-access-6bzfk") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "kube-api-access-6bzfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.026860 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.029622 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.034387 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory" (OuterVolumeSpecName: "inventory") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.092743 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.092943 4804 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.093030 4804 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.093096 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.093154 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.465518 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerDied","Data":"b85edd5f6c172e3fc7186590e69abc45c66e58877217d9c682e3a1e6773d16ec"} Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.465555 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b85edd5f6c172e3fc7186590e69abc45c66e58877217d9c682e3a1e6773d16ec" Feb 17 
14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.465622 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572095 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"] Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572496 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-content" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572511 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-content" Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572530 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572536 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server" Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572553 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-utilities" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572560 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-utilities" Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572578 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572585 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572750 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572758 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.578933 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.584482 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.584496 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.584844 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.585050 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.585369 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.585539 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.592584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"] Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.592776 4804 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704357 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704462 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704596 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 
14:05:08.704649 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704683 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704752 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704933 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704994 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806751 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806860 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806909 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc 
kubenswrapper[4804]: I0217 14:05:08.807027 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807061 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807183 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807668 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807714 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807987 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.811972 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812026 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812099 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 
17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812179 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812379 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.816646 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.818355 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.822884 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.911243 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:09 crc kubenswrapper[4804]: I0217 14:05:09.451025 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"] Feb 17 14:05:09 crc kubenswrapper[4804]: I0217 14:05:09.476580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerStarted","Data":"3737f6f0131a3f5e82616cb8a9012b910af27541cdeac5c9048e6ea1b4d2299d"} Feb 17 14:05:10 crc kubenswrapper[4804]: I0217 14:05:10.486469 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerStarted","Data":"dbda66ccca14c400ac04b20a535081d1e040f266a1132798c4aceb72485b84fa"} Feb 17 14:05:10 crc kubenswrapper[4804]: I0217 14:05:10.509324 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" podStartSLOduration=2.346156847 podStartE2EDuration="2.509300384s" podCreationTimestamp="2026-02-17 14:05:08 +0000 UTC" firstStartedPulling="2026-02-17 14:05:09.467926161 +0000 UTC m=+2383.579345498" lastFinishedPulling="2026-02-17 14:05:09.631069708 +0000 UTC m=+2383.742489035" observedRunningTime="2026-02-17 14:05:10.504495584 +0000 UTC m=+2384.615914931" watchObservedRunningTime="2026-02-17 14:05:10.509300384 +0000 UTC m=+2384.620719721" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.836091 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.837229 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.837303 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.838570 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.838713 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" gracePeriod=600 Feb 17 14:05:25 crc kubenswrapper[4804]: E0217 14:05:25.961435 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.623112 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" exitCode=0 Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.623460 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"} Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.623492 4804 scope.go:117] "RemoveContainer" containerID="8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f" Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.624072 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:05:26 crc kubenswrapper[4804]: E0217 14:05:26.624412 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.278620 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.280927 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.354737 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.449081 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.449248 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.449300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.550740 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.550804 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.550901 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.551418 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.551634 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.572593 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.652226 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.118953 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.658766 4804 generic.go:334] "Generic (PLEG): container finished" podID="4d76421b-4776-498f-a065-58f55d0e6e19" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" exitCode=0 Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.658838 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb"} Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.658902 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerStarted","Data":"7ad75112ffb62ecf766d5c438a715c9d96e81a84818088bc4b50cdcb499f5951"} Feb 17 14:05:31 crc kubenswrapper[4804]: I0217 14:05:31.684675 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerStarted","Data":"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e"} Feb 17 14:05:32 crc kubenswrapper[4804]: I0217 14:05:32.696622 4804 generic.go:334] "Generic (PLEG): container finished" podID="4d76421b-4776-498f-a065-58f55d0e6e19" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" exitCode=0 Feb 17 14:05:32 crc kubenswrapper[4804]: I0217 14:05:32.696681 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" 
event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e"} Feb 17 14:05:33 crc kubenswrapper[4804]: I0217 14:05:33.707060 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerStarted","Data":"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194"} Feb 17 14:05:33 crc kubenswrapper[4804]: I0217 14:05:33.730379 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2czqf" podStartSLOduration=2.003229686 podStartE2EDuration="4.730362935s" podCreationTimestamp="2026-02-17 14:05:29 +0000 UTC" firstStartedPulling="2026-02-17 14:05:30.660699477 +0000 UTC m=+2404.772118814" lastFinishedPulling="2026-02-17 14:05:33.387832726 +0000 UTC m=+2407.499252063" observedRunningTime="2026-02-17 14:05:33.721997946 +0000 UTC m=+2407.833417293" watchObservedRunningTime="2026-02-17 14:05:33.730362935 +0000 UTC m=+2407.841782272" Feb 17 14:05:37 crc kubenswrapper[4804]: I0217 14:05:37.574120 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:05:37 crc kubenswrapper[4804]: E0217 14:05:37.574727 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.653821 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc 
kubenswrapper[4804]: I0217 14:05:39.654153 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.704801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.824187 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.942173 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:41 crc kubenswrapper[4804]: I0217 14:05:41.790892 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2czqf" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" containerID="cri-o://f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" gracePeriod=2 Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.264657 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.424009 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"4d76421b-4776-498f-a065-58f55d0e6e19\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.424155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"4d76421b-4776-498f-a065-58f55d0e6e19\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.424243 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"4d76421b-4776-498f-a065-58f55d0e6e19\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.427178 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities" (OuterVolumeSpecName: "utilities") pod "4d76421b-4776-498f-a065-58f55d0e6e19" (UID: "4d76421b-4776-498f-a065-58f55d0e6e19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.431480 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6" (OuterVolumeSpecName: "kube-api-access-29lf6") pod "4d76421b-4776-498f-a065-58f55d0e6e19" (UID: "4d76421b-4776-498f-a065-58f55d0e6e19"). InnerVolumeSpecName "kube-api-access-29lf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.488714 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d76421b-4776-498f-a065-58f55d0e6e19" (UID: "4d76421b-4776-498f-a065-58f55d0e6e19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.527835 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.528056 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.528172 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.800836 4804 generic.go:334] "Generic (PLEG): container finished" podID="4d76421b-4776-498f-a065-58f55d0e6e19" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" exitCode=0 Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.800900 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.800902 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194"} Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.801284 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"7ad75112ffb62ecf766d5c438a715c9d96e81a84818088bc4b50cdcb499f5951"} Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.801330 4804 scope.go:117] "RemoveContainer" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.898641 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.908513 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.915102 4804 scope.go:117] "RemoveContainer" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.941089 4804 scope.go:117] "RemoveContainer" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.985895 4804 scope.go:117] "RemoveContainer" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" Feb 17 14:05:42 crc kubenswrapper[4804]: E0217 14:05:42.986322 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194\": container with ID starting with f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194 not found: ID does not exist" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986363 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194"} err="failed to get container status \"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194\": rpc error: code = NotFound desc = could not find container \"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194\": container with ID starting with f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194 not found: ID does not exist" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986389 4804 scope.go:117] "RemoveContainer" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" Feb 17 14:05:42 crc kubenswrapper[4804]: E0217 14:05:42.986604 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e\": container with ID starting with 6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e not found: ID does not exist" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986627 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e"} err="failed to get container status \"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e\": rpc error: code = NotFound desc = could not find container \"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e\": container with ID 
starting with 6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e not found: ID does not exist" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986642 4804 scope.go:117] "RemoveContainer" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" Feb 17 14:05:42 crc kubenswrapper[4804]: E0217 14:05:42.986842 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb\": container with ID starting with 1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb not found: ID does not exist" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986869 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb"} err="failed to get container status \"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb\": rpc error: code = NotFound desc = could not find container \"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb\": container with ID starting with 1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb not found: ID does not exist" Feb 17 14:05:44 crc kubenswrapper[4804]: I0217 14:05:44.595491 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" path="/var/lib/kubelet/pods/4d76421b-4776-498f-a065-58f55d0e6e19/volumes" Feb 17 14:05:48 crc kubenswrapper[4804]: I0217 14:05:48.574042 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:05:48 crc kubenswrapper[4804]: E0217 14:05:48.575672 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:00 crc kubenswrapper[4804]: I0217 14:06:00.574063 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:00 crc kubenswrapper[4804]: E0217 14:06:00.574833 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:14 crc kubenswrapper[4804]: I0217 14:06:14.574627 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:14 crc kubenswrapper[4804]: E0217 14:06:14.575639 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:27 crc kubenswrapper[4804]: I0217 14:06:27.574537 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:27 crc kubenswrapper[4804]: E0217 14:06:27.576655 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.862934 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:38 crc kubenswrapper[4804]: E0217 14:06:38.863971 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-content" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.863989 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-content" Feb 17 14:06:38 crc kubenswrapper[4804]: E0217 14:06:38.864019 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-utilities" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.864027 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-utilities" Feb 17 14:06:38 crc kubenswrapper[4804]: E0217 14:06:38.864044 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.864051 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.864274 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.865917 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.881964 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.885260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.885391 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.885474 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987690 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987785 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.010008 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.186566 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.574073 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:39 crc kubenswrapper[4804]: E0217 14:06:39.574698 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.701469 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.375565 4804 generic.go:334] "Generic (PLEG): container finished" podID="185fa21c-049f-41b3-9031-318a3c21ecef" containerID="63d2f999f3b20ab902c761ba7009b771de4228c739dc86596893270097ddcd50" exitCode=0 Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.375650 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"63d2f999f3b20ab902c761ba7009b771de4228c739dc86596893270097ddcd50"} Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.375942 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerStarted","Data":"71c0d4db4d6b00eb21e2b36531f6d666ffe8b21b661819d2b23f0bd4323aa817"} Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.377968 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:06:42 crc 
kubenswrapper[4804]: I0217 14:06:42.394378 4804 generic.go:334] "Generic (PLEG): container finished" podID="185fa21c-049f-41b3-9031-318a3c21ecef" containerID="f01193bdf1e82e2a6e3fea01bf103d3eb3adc6c861f2b2ac1692bb87fc0b6c46" exitCode=0 Feb 17 14:06:42 crc kubenswrapper[4804]: I0217 14:06:42.394434 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"f01193bdf1e82e2a6e3fea01bf103d3eb3adc6c861f2b2ac1692bb87fc0b6c46"} Feb 17 14:06:43 crc kubenswrapper[4804]: I0217 14:06:43.412406 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerStarted","Data":"a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127"} Feb 17 14:06:43 crc kubenswrapper[4804]: I0217 14:06:43.436540 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lhfth" podStartSLOduration=2.853122149 podStartE2EDuration="5.436518053s" podCreationTimestamp="2026-02-17 14:06:38 +0000 UTC" firstStartedPulling="2026-02-17 14:06:40.376981549 +0000 UTC m=+2474.488400886" lastFinishedPulling="2026-02-17 14:06:42.960377453 +0000 UTC m=+2477.071796790" observedRunningTime="2026-02-17 14:06:43.428606638 +0000 UTC m=+2477.540025975" watchObservedRunningTime="2026-02-17 14:06:43.436518053 +0000 UTC m=+2477.547937390" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.187231 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.189538 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.252776 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.543314 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.590845 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:51 crc kubenswrapper[4804]: I0217 14:06:51.511217 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lhfth" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server" containerID="cri-o://a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127" gracePeriod=2 Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526069 4804 generic.go:334] "Generic (PLEG): container finished" podID="185fa21c-049f-41b3-9031-318a3c21ecef" containerID="a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127" exitCode=0 Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526230 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127"} Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"71c0d4db4d6b00eb21e2b36531f6d666ffe8b21b661819d2b23f0bd4323aa817"} Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526442 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71c0d4db4d6b00eb21e2b36531f6d666ffe8b21b661819d2b23f0bd4323aa817" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.574165 4804 scope.go:117] 
"RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:52 crc kubenswrapper[4804]: E0217 14:06:52.574552 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.615848 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.766820 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"185fa21c-049f-41b3-9031-318a3c21ecef\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.767090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"185fa21c-049f-41b3-9031-318a3c21ecef\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.767153 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"185fa21c-049f-41b3-9031-318a3c21ecef\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.770117 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities" (OuterVolumeSpecName: "utilities") pod "185fa21c-049f-41b3-9031-318a3c21ecef" (UID: "185fa21c-049f-41b3-9031-318a3c21ecef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.775256 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9" (OuterVolumeSpecName: "kube-api-access-qdsv9") pod "185fa21c-049f-41b3-9031-318a3c21ecef" (UID: "185fa21c-049f-41b3-9031-318a3c21ecef"). InnerVolumeSpecName "kube-api-access-qdsv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.869592 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.869638 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.895037 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "185fa21c-049f-41b3-9031-318a3c21ecef" (UID: "185fa21c-049f-41b3-9031-318a3c21ecef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.971675 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:53 crc kubenswrapper[4804]: I0217 14:06:53.535628 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:53 crc kubenswrapper[4804]: I0217 14:06:53.573686 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:53 crc kubenswrapper[4804]: I0217 14:06:53.586921 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:54 crc kubenswrapper[4804]: I0217 14:06:54.587468 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" path="/var/lib/kubelet/pods/185fa21c-049f-41b3-9031-318a3c21ecef/volumes" Feb 17 14:07:03 crc kubenswrapper[4804]: I0217 14:07:03.574435 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:07:03 crc kubenswrapper[4804]: E0217 14:07:03.575379 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:07:15 crc kubenswrapper[4804]: I0217 14:07:15.574507 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:07:15 crc 
kubenswrapper[4804]: E0217 14:07:15.575808 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:07:25 crc kubenswrapper[4804]: I0217 14:07:25.823021 4804 generic.go:334] "Generic (PLEG): container finished" podID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerID="dbda66ccca14c400ac04b20a535081d1e040f266a1132798c4aceb72485b84fa" exitCode=0
Feb 17 14:07:25 crc kubenswrapper[4804]: I0217 14:07:25.823154 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerDied","Data":"dbda66ccca14c400ac04b20a535081d1e040f266a1132798c4aceb72485b84fa"}
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.227239 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274587 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274657 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274678 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274725 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274760 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274819 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274847 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274871 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274900 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") "
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.280446 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.280678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f" (OuterVolumeSpecName: "kube-api-access-gv28f") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "kube-api-access-gv28f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.307022 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory" (OuterVolumeSpecName: "inventory") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.307538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.308837 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.310403 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.311500 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.315404 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.319453 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.376849 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377171 4804 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377323 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377513 4804 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377664 4804 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377758 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377851 4804 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377947 4804 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.378004 4804 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.575339 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.576094 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.844668 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerDied","Data":"3737f6f0131a3f5e82616cb8a9012b910af27541cdeac5c9048e6ea1b4d2299d"}
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.844715 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3737f6f0131a3f5e82616cb8a9012b910af27541cdeac5c9048e6ea1b4d2299d"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.844755 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.935988 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"]
Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936432 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-utilities"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936452 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-utilities"
Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936463 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936472 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server"
Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936496 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936504 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936539 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-content"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936546 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-content"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936782 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936803 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.937532 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.939779 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.939922 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.939934 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.940013 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.940256 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.955244 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"]
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988070 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988131 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988166 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988227 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988416 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988629 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.089985 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090059 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090114 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090293 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090355 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.095865 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.096063 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.096472 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.096874 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.097274 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.097485 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.108548 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.258372 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.760930 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"]
Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.853853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerStarted","Data":"d5c61072e65010bf5df8b12c5d629af6bb39700b5930687931000fc84258080e"}
Feb 17 14:07:29 crc kubenswrapper[4804]: I0217 14:07:29.867396 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerStarted","Data":"27e33d208e79739a22cd976d57adcf8d08a28d6918b4dc63998a98c640c7b7d3"}
Feb 17 14:07:42 crc kubenswrapper[4804]: I0217 14:07:42.575718 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:07:42 crc kubenswrapper[4804]: E0217 14:07:42.576490 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:07:57 crc kubenswrapper[4804]: I0217 14:07:57.574403 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:07:57 crc kubenswrapper[4804]: E0217 14:07:57.575138 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:08:10 crc kubenswrapper[4804]: I0217 14:08:10.574797 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:08:10 crc kubenswrapper[4804]: E0217 14:08:10.575591 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:08:21 crc kubenswrapper[4804]: I0217 14:08:21.574300 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:08:21 crc kubenswrapper[4804]: E0217 14:08:21.575040 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:08:35 crc kubenswrapper[4804]: I0217 14:08:35.574276 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:08:35 crc kubenswrapper[4804]: E0217 14:08:35.574931 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:08:48 crc kubenswrapper[4804]: I0217 14:08:48.574709 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:08:48 crc kubenswrapper[4804]: E0217 14:08:48.575611 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:09:00 crc kubenswrapper[4804]: I0217 14:09:00.591273 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:09:00 crc kubenswrapper[4804]: E0217 14:09:00.592376 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:09:13 crc kubenswrapper[4804]: I0217 14:09:13.574689 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:09:13 crc kubenswrapper[4804]: E0217 14:09:13.575467 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:09:24 crc kubenswrapper[4804]: I0217 14:09:24.574731 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:09:24 crc kubenswrapper[4804]: E0217 14:09:24.575634 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:09:39 crc kubenswrapper[4804]: I0217 14:09:39.574468 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"
Feb 17 14:09:39 crc kubenswrapper[4804]: E0217 14:09:39.575120 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:09:41 crc kubenswrapper[4804]: I0217 14:09:41.238877 4804 generic.go:334] "Generic (PLEG): container finished" podID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerID="27e33d208e79739a22cd976d57adcf8d08a28d6918b4dc63998a98c640c7b7d3" exitCode=0
Feb 17 14:09:41 crc kubenswrapper[4804]: I0217 14:09:41.238975 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerDied","Data":"27e33d208e79739a22cd976d57adcf8d08a28d6918b4dc63998a98c640c7b7d3"}
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.707884 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") "
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789614 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") "
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789682 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") "
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789806 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") "
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789836 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") "
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789883 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") "
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789912 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") "
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.807429 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.811392 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c" (OuterVolumeSpecName: "kube-api-access-cnn4c") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "kube-api-access-cnn4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.820006 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.821623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.826640 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.827275 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ceilometer-compute-config-data-2".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.829490 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory" (OuterVolumeSpecName: "inventory") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892407 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892660 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892673 4804 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892686 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892695 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 
14:09:42.892705 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892713 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:43 crc kubenswrapper[4804]: I0217 14:09:43.260588 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerDied","Data":"d5c61072e65010bf5df8b12c5d629af6bb39700b5930687931000fc84258080e"} Feb 17 14:09:43 crc kubenswrapper[4804]: I0217 14:09:43.260637 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5c61072e65010bf5df8b12c5d629af6bb39700b5930687931000fc84258080e" Feb 17 14:09:43 crc kubenswrapper[4804]: I0217 14:09:43.260643 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:09:53 crc kubenswrapper[4804]: I0217 14:09:53.574097 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:09:53 crc kubenswrapper[4804]: E0217 14:09:53.574665 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:10:08 crc kubenswrapper[4804]: I0217 14:10:08.574190 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:10:08 crc kubenswrapper[4804]: E0217 14:10:08.575484 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:10:23 crc kubenswrapper[4804]: I0217 14:10:23.574092 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:10:23 crc kubenswrapper[4804]: E0217 14:10:23.574859 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.439106 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 14:10:26 crc kubenswrapper[4804]: E0217 14:10:26.439817 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.439831 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.440014 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.440685 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.443536 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.444411 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.444858 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.445730 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kssn4" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.454556 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546635 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546695 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546730 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546837 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546877 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547077 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547181 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547357 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547547 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.648838 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.648911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.648978 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649019 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") 
pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649046 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649066 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649118 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649148 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649191 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 
14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649653 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649649 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649983 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.651572 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.655506 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.655681 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.657232 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.660597 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.661671 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.663398 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.666889 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.680717 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.779519 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kssn4" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.787536 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 14:10:27 crc kubenswrapper[4804]: I0217 14:10:27.231165 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 14:10:27 crc kubenswrapper[4804]: I0217 14:10:27.668626 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerStarted","Data":"35721c59346596c631486087761565338b01be7cc9c8b0659285af567a265321"} Feb 17 14:10:35 crc kubenswrapper[4804]: I0217 14:10:35.575127 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:10:36 crc kubenswrapper[4804]: I0217 14:10:36.768971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723"} Feb 17 14:11:04 crc kubenswrapper[4804]: E0217 14:11:04.550837 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 17 14:11:04 crc kubenswrapper[4804]: E0217 14:11:04.551455 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-548bt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(f7b246dc-1d07-4725-b471-88fe82584d24): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:11:04 crc kubenswrapper[4804]: E0217 14:11:04.552737 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" Feb 17 14:11:05 crc kubenswrapper[4804]: E0217 14:11:05.068441 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" Feb 17 14:11:17 crc 
kubenswrapper[4804]: I0217 14:11:17.037078 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 17 14:11:18 crc kubenswrapper[4804]: I0217 14:11:18.209186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerStarted","Data":"d537c8e502573d470d3444dc025ba077411e9d8c16e3d0c7fcbea501f31e4c98"} Feb 17 14:11:18 crc kubenswrapper[4804]: I0217 14:11:18.238088 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.446061712 podStartE2EDuration="53.238065479s" podCreationTimestamp="2026-02-17 14:10:25 +0000 UTC" firstStartedPulling="2026-02-17 14:10:27.241142569 +0000 UTC m=+2701.352561906" lastFinishedPulling="2026-02-17 14:11:17.033146336 +0000 UTC m=+2751.144565673" observedRunningTime="2026-02-17 14:11:18.225375165 +0000 UTC m=+2752.336794502" watchObservedRunningTime="2026-02-17 14:11:18.238065479 +0000 UTC m=+2752.349484816" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.140605 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.143432 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.156154 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.283136 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.283190 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.283707 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386234 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386291 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386444 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386892 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386925 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.418175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.482249 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.935642 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.871857 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" exitCode=0 Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.871964 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3"} Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.872243 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerStarted","Data":"fc548041e26d0614531ae99cf8e30f06221c5f4d8be0ee5276ce2c338d7913a8"} Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.875989 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:12:33 crc kubenswrapper[4804]: I0217 14:12:33.893537 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" exitCode=0 Feb 17 14:12:33 crc kubenswrapper[4804]: I0217 14:12:33.893778 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734"} Feb 17 14:12:34 crc kubenswrapper[4804]: I0217 14:12:34.910134 4804 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerStarted","Data":"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913"} Feb 17 14:12:34 crc kubenswrapper[4804]: I0217 14:12:34.929387 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6fhg" podStartSLOduration=2.2719784499999998 podStartE2EDuration="4.929370771s" podCreationTimestamp="2026-02-17 14:12:30 +0000 UTC" firstStartedPulling="2026-02-17 14:12:31.874518708 +0000 UTC m=+2825.985938085" lastFinishedPulling="2026-02-17 14:12:34.531911069 +0000 UTC m=+2828.643330406" observedRunningTime="2026-02-17 14:12:34.925780368 +0000 UTC m=+2829.037199705" watchObservedRunningTime="2026-02-17 14:12:34.929370771 +0000 UTC m=+2829.040790108" Feb 17 14:12:40 crc kubenswrapper[4804]: I0217 14:12:40.482825 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:40 crc kubenswrapper[4804]: I0217 14:12:40.483711 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:40 crc kubenswrapper[4804]: I0217 14:12:40.585360 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:41 crc kubenswrapper[4804]: I0217 14:12:41.023006 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:41 crc kubenswrapper[4804]: I0217 14:12:41.090850 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:42 crc kubenswrapper[4804]: I0217 14:12:42.989252 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v6fhg" 
podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" containerID="cri-o://6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" gracePeriod=2 Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.468119 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.550551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.550667 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.550718 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.551302 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities" (OuterVolumeSpecName: "utilities") pod "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" (UID: "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.556658 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d" (OuterVolumeSpecName: "kube-api-access-5ml5d") pod "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" (UID: "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1"). InnerVolumeSpecName "kube-api-access-5ml5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.573461 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" (UID: "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.653224 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.653265 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.653279 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005475 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" 
containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" exitCode=0 Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005521 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913"} Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"fc548041e26d0614531ae99cf8e30f06221c5f4d8be0ee5276ce2c338d7913a8"} Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005566 4804 scope.go:117] "RemoveContainer" containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005697 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.025672 4804 scope.go:117] "RemoveContainer" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.058087 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.071046 4804 scope.go:117] "RemoveContainer" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.071063 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.104480 4804 scope.go:117] "RemoveContainer" containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" Feb 17 14:12:44 crc kubenswrapper[4804]: E0217 14:12:44.105124 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913\": container with ID starting with 6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913 not found: ID does not exist" containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105229 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913"} err="failed to get container status \"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913\": rpc error: code = NotFound desc = could not find container \"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913\": container with ID starting with 6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913 not found: 
ID does not exist" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105291 4804 scope.go:117] "RemoveContainer" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" Feb 17 14:12:44 crc kubenswrapper[4804]: E0217 14:12:44.105754 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734\": container with ID starting with cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734 not found: ID does not exist" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105801 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734"} err="failed to get container status \"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734\": rpc error: code = NotFound desc = could not find container \"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734\": container with ID starting with cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734 not found: ID does not exist" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105834 4804 scope.go:117] "RemoveContainer" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" Feb 17 14:12:44 crc kubenswrapper[4804]: E0217 14:12:44.106171 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3\": container with ID starting with a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3 not found: ID does not exist" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.106237 4804 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3"} err="failed to get container status \"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3\": rpc error: code = NotFound desc = could not find container \"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3\": container with ID starting with a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3 not found: ID does not exist" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.584829 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" path="/var/lib/kubelet/pods/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1/volumes" Feb 17 14:12:55 crc kubenswrapper[4804]: I0217 14:12:55.835295 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:12:55 crc kubenswrapper[4804]: I0217 14:12:55.835893 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:13:25 crc kubenswrapper[4804]: I0217 14:13:25.835715 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:13:25 crc kubenswrapper[4804]: I0217 14:13:25.836365 4804 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:13:31 crc kubenswrapper[4804]: I0217 14:13:31.594462 4804 scope.go:117] "RemoveContainer" containerID="63d2f999f3b20ab902c761ba7009b771de4228c739dc86596893270097ddcd50" Feb 17 14:13:31 crc kubenswrapper[4804]: I0217 14:13:31.624323 4804 scope.go:117] "RemoveContainer" containerID="a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127" Feb 17 14:13:31 crc kubenswrapper[4804]: I0217 14:13:31.690546 4804 scope.go:117] "RemoveContainer" containerID="f01193bdf1e82e2a6e3fea01bf103d3eb3adc6c861f2b2ac1692bb87fc0b6c46" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.835076 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.835610 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.835651 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.836355 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.836403 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723" gracePeriod=600 Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.674865 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723" exitCode=0 Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.675058 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723"} Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.675597 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"} Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.675621 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.006606 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:08 crc kubenswrapper[4804]: E0217 
14:14:08.016705 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-content" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.016949 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-content" Feb 17 14:14:08 crc kubenswrapper[4804]: E0217 14:14:08.017068 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-utilities" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.017158 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-utilities" Feb 17 14:14:08 crc kubenswrapper[4804]: E0217 14:14:08.017291 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.017380 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.017755 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.019917 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.035136 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.187584 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.188594 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.188648 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290322 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290455 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290509 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290890 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.291237 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.320884 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.362147 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.894423 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:09 crc kubenswrapper[4804]: I0217 14:14:09.792516 4804 generic.go:334] "Generic (PLEG): container finished" podID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c" exitCode=0 Feb 17 14:14:09 crc kubenswrapper[4804]: I0217 14:14:09.792593 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c"} Feb 17 14:14:09 crc kubenswrapper[4804]: I0217 14:14:09.792909 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerStarted","Data":"22b93f76de275c61d7af6af439b9be25047ded458855de738f24cea5fd962af2"} Feb 17 14:14:10 crc kubenswrapper[4804]: I0217 14:14:10.803810 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerStarted","Data":"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"} Feb 17 14:14:11 crc kubenswrapper[4804]: I0217 14:14:11.816740 4804 generic.go:334] "Generic (PLEG): container finished" podID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a" exitCode=0 Feb 17 14:14:11 crc kubenswrapper[4804]: I0217 14:14:11.816850 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" 
event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"} Feb 17 14:14:12 crc kubenswrapper[4804]: I0217 14:14:12.839010 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerStarted","Data":"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"} Feb 17 14:14:12 crc kubenswrapper[4804]: I0217 14:14:12.859328 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bg24h" podStartSLOduration=3.407514602 podStartE2EDuration="5.859306179s" podCreationTimestamp="2026-02-17 14:14:07 +0000 UTC" firstStartedPulling="2026-02-17 14:14:09.795505748 +0000 UTC m=+2923.906925105" lastFinishedPulling="2026-02-17 14:14:12.247297345 +0000 UTC m=+2926.358716682" observedRunningTime="2026-02-17 14:14:12.856961446 +0000 UTC m=+2926.968380793" watchObservedRunningTime="2026-02-17 14:14:12.859306179 +0000 UTC m=+2926.970725506" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.362698 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.363339 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.410716 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.940399 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.994106 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:20 crc kubenswrapper[4804]: I0217 14:14:20.913290 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bg24h" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server" containerID="cri-o://2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d" gracePeriod=2 Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.407783 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.438379 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"ea983551-05ac-4386-8afb-4c1e289de6bd\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.438690 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"ea983551-05ac-4386-8afb-4c1e289de6bd\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.438917 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"ea983551-05ac-4386-8afb-4c1e289de6bd\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.445500 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp" (OuterVolumeSpecName: "kube-api-access-c56fp") pod 
"ea983551-05ac-4386-8afb-4c1e289de6bd" (UID: "ea983551-05ac-4386-8afb-4c1e289de6bd"). InnerVolumeSpecName "kube-api-access-c56fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.448008 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities" (OuterVolumeSpecName: "utilities") pod "ea983551-05ac-4386-8afb-4c1e289de6bd" (UID: "ea983551-05ac-4386-8afb-4c1e289de6bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.502588 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea983551-05ac-4386-8afb-4c1e289de6bd" (UID: "ea983551-05ac-4386-8afb-4c1e289de6bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.541163 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") on node \"crc\" DevicePath \"\""
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.541218 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.541233 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926334 4804 generic.go:334] "Generic (PLEG): container finished" podID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d" exitCode=0
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926402 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"}
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926445 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"22b93f76de275c61d7af6af439b9be25047ded458855de738f24cea5fd962af2"}
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926474 4804 scope.go:117] "RemoveContainer" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926648 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bg24h"
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.951640 4804 scope.go:117] "RemoveContainer" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.985502 4804 scope.go:117] "RemoveContainer" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c"
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.986118 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bg24h"]
Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.994279 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bg24h"]
Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.031840 4804 scope.go:117] "RemoveContainer" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"
Feb 17 14:14:22 crc kubenswrapper[4804]: E0217 14:14:22.032468 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d\": container with ID starting with 2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d not found: ID does not exist" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"
Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.032620 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"} err="failed to get container status \"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d\": rpc error: code = NotFound desc = could not find container \"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d\": container with ID starting with 2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d not found: ID does not exist"
Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.032757 4804 scope.go:117] "RemoveContainer" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"
Feb 17 14:14:22 crc kubenswrapper[4804]: E0217 14:14:22.033430 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a\": container with ID starting with bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a not found: ID does not exist" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"
Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.033501 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"} err="failed to get container status \"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a\": rpc error: code = NotFound desc = could not find container \"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a\": container with ID starting with bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a not found: ID does not exist"
Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.033538 4804 scope.go:117] "RemoveContainer" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c"
Feb 17 14:14:22 crc kubenswrapper[4804]: E0217 14:14:22.033919 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c\": container with ID starting with ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c not found: ID does not exist" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c"
Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.034008 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c"} err="failed to get container status \"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c\": rpc error: code = NotFound desc = could not find container \"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c\": container with ID starting with ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c not found: ID does not exist"
Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.594871 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" path="/var/lib/kubelet/pods/ea983551-05ac-4386-8afb-4c1e289de6bd/volumes"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.150708 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"]
Feb 17 14:15:00 crc kubenswrapper[4804]: E0217 14:15:00.152857 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="extract-utilities"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.152953 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="extract-utilities"
Feb 17 14:15:00 crc kubenswrapper[4804]: E0217 14:15:00.153034 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.153122 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server"
Feb 17 14:15:00 crc kubenswrapper[4804]: E0217 14:15:00.153193 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="extract-content"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.153271 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="extract-content"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.153643 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.154490 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.159691 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.160022 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.168264 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"]
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.248249 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.248651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.248812 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.350600 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.350647 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.350723 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.352080 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.360240 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.367395 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.473014 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.939028 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"]
Feb 17 14:15:01 crc kubenswrapper[4804]: I0217 14:15:01.637077 4804 generic.go:334] "Generic (PLEG): container finished" podID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerID="377a1a93986753ac71ee083bd27b66be5e6cf98f0f0c8284bcdc9bdc6c8b8e33" exitCode=0
Feb 17 14:15:01 crc kubenswrapper[4804]: I0217 14:15:01.637136 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" event={"ID":"c3e8a4e9-ee0a-4283-835f-de5a54c8136d","Type":"ContainerDied","Data":"377a1a93986753ac71ee083bd27b66be5e6cf98f0f0c8284bcdc9bdc6c8b8e33"}
Feb 17 14:15:01 crc kubenswrapper[4804]: I0217 14:15:01.637412 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" event={"ID":"c3e8a4e9-ee0a-4283-835f-de5a54c8136d","Type":"ContainerStarted","Data":"d3ac1606b3844e4108d0ef4f9f435f220a5046c56eb0032070247a6cd4bbd90b"}
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.012455 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.102989 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") "
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.104292 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") "
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.104374 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") "
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.105153 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3e8a4e9-ee0a-4283-835f-de5a54c8136d" (UID: "c3e8a4e9-ee0a-4283-835f-de5a54c8136d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.105551 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.109783 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c3e8a4e9-ee0a-4283-835f-de5a54c8136d" (UID: "c3e8a4e9-ee0a-4283-835f-de5a54c8136d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.110240 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q" (OuterVolumeSpecName: "kube-api-access-qlj9q") pod "c3e8a4e9-ee0a-4283-835f-de5a54c8136d" (UID: "c3e8a4e9-ee0a-4283-835f-de5a54c8136d"). InnerVolumeSpecName "kube-api-access-qlj9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.207284 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.207324 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.653971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" event={"ID":"c3e8a4e9-ee0a-4283-835f-de5a54c8136d","Type":"ContainerDied","Data":"d3ac1606b3844e4108d0ef4f9f435f220a5046c56eb0032070247a6cd4bbd90b"}
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.654252 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3ac1606b3844e4108d0ef4f9f435f220a5046c56eb0032070247a6cd4bbd90b"
Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.654003 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"
Feb 17 14:15:04 crc kubenswrapper[4804]: I0217 14:15:04.087614 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"]
Feb 17 14:15:04 crc kubenswrapper[4804]: I0217 14:15:04.096305 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"]
Feb 17 14:15:04 crc kubenswrapper[4804]: I0217 14:15:04.584945 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" path="/var/lib/kubelet/pods/f9f0ac4b-5b59-4ff9-92ba-54668fffef27/volumes"
Feb 17 14:15:31 crc kubenswrapper[4804]: I0217 14:15:31.821840 4804 scope.go:117] "RemoveContainer" containerID="c63647c4f782e7514611e89775cb3101cab0f160b6675c0b2e9972791cd22306"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.707039 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mhl95"]
Feb 17 14:15:38 crc kubenswrapper[4804]: E0217 14:15:38.708734 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerName="collect-profiles"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.708755 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerName="collect-profiles"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.708975 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerName="collect-profiles"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.712029 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.724041 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"]
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.787162 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.787820 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.787947 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.889644 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.889745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.889875 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.890144 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.890307 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.918097 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.053849 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.553669 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"]
Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.987512 4804 generic.go:334] "Generic (PLEG): container finished" podID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a" exitCode=0
Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.987637 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a"}
Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.987837 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerStarted","Data":"75a2dce20291625dd97df834d1901fc5bd4bff2bee391da393378fee4ed223cb"}
Feb 17 14:15:42 crc kubenswrapper[4804]: I0217 14:15:42.013086 4804 generic.go:334] "Generic (PLEG): container finished" podID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d" exitCode=0
Feb 17 14:15:42 crc kubenswrapper[4804]: I0217 14:15:42.013183 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d"}
Feb 17 14:15:43 crc kubenswrapper[4804]: I0217 14:15:43.024155 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerStarted","Data":"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"}
Feb 17 14:15:43 crc kubenswrapper[4804]: I0217 14:15:43.044914 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhl95" podStartSLOduration=2.524710135 podStartE2EDuration="5.044897066s" podCreationTimestamp="2026-02-17 14:15:38 +0000 UTC" firstStartedPulling="2026-02-17 14:15:39.98924274 +0000 UTC m=+3014.100662077" lastFinishedPulling="2026-02-17 14:15:42.509429671 +0000 UTC m=+3016.620849008" observedRunningTime="2026-02-17 14:15:43.041694665 +0000 UTC m=+3017.153114002" watchObservedRunningTime="2026-02-17 14:15:43.044897066 +0000 UTC m=+3017.156316393"
Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.054016 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.054666 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.106675 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.153745 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.348115 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"]
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.089098 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhl95" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" containerID="cri-o://18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa" gracePeriod=2
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.585893 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.732413 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"66512154-e5a4-4d46-9d1b-a091a9f9631d\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") "
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.732477 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"66512154-e5a4-4d46-9d1b-a091a9f9631d\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") "
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.732517 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"66512154-e5a4-4d46-9d1b-a091a9f9631d\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") "
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.733469 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities" (OuterVolumeSpecName: "utilities") pod "66512154-e5a4-4d46-9d1b-a091a9f9631d" (UID: "66512154-e5a4-4d46-9d1b-a091a9f9631d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.734448 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.745546 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g" (OuterVolumeSpecName: "kube-api-access-7hj8g") pod "66512154-e5a4-4d46-9d1b-a091a9f9631d" (UID: "66512154-e5a4-4d46-9d1b-a091a9f9631d"). InnerVolumeSpecName "kube-api-access-7hj8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.784330 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66512154-e5a4-4d46-9d1b-a091a9f9631d" (UID: "66512154-e5a4-4d46-9d1b-a091a9f9631d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.836653 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.836703 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.099839 4804 generic.go:334] "Generic (PLEG): container finished" podID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa" exitCode=0
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.099925 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.099907 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"}
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.100096 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"75a2dce20291625dd97df834d1901fc5bd4bff2bee391da393378fee4ed223cb"}
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.100131 4804 scope.go:117] "RemoveContainer" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.135339 4804 scope.go:117] "RemoveContainer" containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.151843 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"]
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.163161 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"]
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.169304 4804 scope.go:117] "RemoveContainer" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205167 4804 scope.go:117] "RemoveContainer" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"
Feb 17 14:15:52 crc kubenswrapper[4804]: E0217 14:15:52.205613 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa\": container with ID starting with 18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa not found: ID does not exist" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205650 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"} err="failed to get container status \"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa\": rpc error: code = NotFound desc = could not find container \"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa\": container with ID starting with 18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa not found: ID does not exist"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205678 4804 scope.go:117] "RemoveContainer" containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d"
Feb 17 14:15:52 crc kubenswrapper[4804]: E0217 14:15:52.205943 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d\": container with ID starting with fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d not found: ID does not exist" containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205993 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d"} err="failed to get container status \"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d\": rpc error: code = NotFound desc = could not find container \"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d\": container with ID starting with fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d not found: ID does not exist"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.206012 4804 scope.go:117] "RemoveContainer" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a"
Feb 17 14:15:52 crc kubenswrapper[4804]: E0217 14:15:52.206264 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a\": container with ID starting with 9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a not found: ID does not exist" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.206294 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a"} err="failed to get container status \"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a\": rpc error: code = NotFound desc = could not find container \"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a\": container with ID starting with 9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a not found: ID does not exist"
Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.587441 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" path="/var/lib/kubelet/pods/66512154-e5a4-4d46-9d1b-a091a9f9631d/volumes"
Feb 17 14:16:25 crc kubenswrapper[4804]: I0217 14:16:25.835682 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:16:25 crc kubenswrapper[4804]: I0217 14:16:25.836398 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:16:55 crc kubenswrapper[4804]: I0217 14:16:55.835828 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:16:55 crc kubenswrapper[4804]: I0217 14:16:55.838458 4804 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.835838 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.836561 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.836633 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.837696 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.837768 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" 
containerID="cri-o://cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" gracePeriod=600 Feb 17 14:17:25 crc kubenswrapper[4804]: E0217 14:17:25.966914 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.966435 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" exitCode=0 Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.966529 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"} Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.966884 4804 scope.go:117] "RemoveContainer" containerID="3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723" Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.967914 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:17:26 crc kubenswrapper[4804]: E0217 14:17:26.968514 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:17:37 crc kubenswrapper[4804]: I0217 14:17:37.574810 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:17:37 crc kubenswrapper[4804]: E0217 14:17:37.575966 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:17:48 crc kubenswrapper[4804]: I0217 14:17:48.575163 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:17:48 crc kubenswrapper[4804]: E0217 14:17:48.576061 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:03 crc kubenswrapper[4804]: I0217 14:18:03.574371 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:03 crc kubenswrapper[4804]: E0217 14:18:03.575317 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:17 crc kubenswrapper[4804]: I0217 14:18:17.574836 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:17 crc kubenswrapper[4804]: E0217 14:18:17.576418 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:31 crc kubenswrapper[4804]: I0217 14:18:31.574892 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:31 crc kubenswrapper[4804]: E0217 14:18:31.576219 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:42 crc kubenswrapper[4804]: I0217 14:18:42.574038 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:42 crc kubenswrapper[4804]: E0217 14:18:42.574828 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:54 crc kubenswrapper[4804]: I0217 14:18:54.574749 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:54 crc kubenswrapper[4804]: E0217 14:18:54.575622 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:08 crc kubenswrapper[4804]: I0217 14:19:08.574744 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:08 crc kubenswrapper[4804]: E0217 14:19:08.575794 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:19 crc kubenswrapper[4804]: I0217 14:19:19.574037 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:19 crc kubenswrapper[4804]: E0217 14:19:19.574826 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:32 crc kubenswrapper[4804]: I0217 14:19:32.574603 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:32 crc kubenswrapper[4804]: E0217 14:19:32.575334 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:43 crc kubenswrapper[4804]: I0217 14:19:43.573837 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:43 crc kubenswrapper[4804]: E0217 14:19:43.574658 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:55 crc kubenswrapper[4804]: I0217 14:19:55.574350 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:55 crc kubenswrapper[4804]: E0217 14:19:55.575183 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:08 crc kubenswrapper[4804]: I0217 14:20:08.575123 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:08 crc kubenswrapper[4804]: E0217 14:20:08.576351 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:22 crc kubenswrapper[4804]: I0217 14:20:22.574348 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:22 crc kubenswrapper[4804]: E0217 14:20:22.575324 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:36 crc kubenswrapper[4804]: I0217 14:20:36.580163 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:36 crc kubenswrapper[4804]: E0217 14:20:36.581096 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:50 crc kubenswrapper[4804]: I0217 14:20:50.573860 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:50 crc kubenswrapper[4804]: E0217 14:20:50.574779 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:03 crc kubenswrapper[4804]: I0217 14:21:03.575029 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:21:03 crc kubenswrapper[4804]: E0217 14:21:03.577145 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391115 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:06 crc kubenswrapper[4804]: E0217 14:21:06.391620 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" 
containerName="extract-utilities" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391892 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="extract-utilities" Feb 17 14:21:06 crc kubenswrapper[4804]: E0217 14:21:06.391923 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391933 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" Feb 17 14:21:06 crc kubenswrapper[4804]: E0217 14:21:06.391963 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="extract-content" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391971 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="extract-content" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.392174 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.393653 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.412724 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.542827 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.542905 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.542998 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.644583 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.644685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.644810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.645284 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.645714 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.669681 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.730750 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:07 crc kubenswrapper[4804]: I0217 14:21:07.216497 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:07 crc kubenswrapper[4804]: I0217 14:21:07.459718 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerStarted","Data":"afdba798f22337cb141f4e18f48d4777e222a4f386e4ae115dbbb8ab5c633f89"} Feb 17 14:21:08 crc kubenswrapper[4804]: I0217 14:21:08.467902 4804 generic.go:334] "Generic (PLEG): container finished" podID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad" exitCode=0 Feb 17 14:21:08 crc kubenswrapper[4804]: I0217 14:21:08.468038 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad"} Feb 17 14:21:08 crc kubenswrapper[4804]: I0217 14:21:08.470927 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:21:09 crc kubenswrapper[4804]: I0217 14:21:09.478001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerStarted","Data":"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"} Feb 17 14:21:10 crc kubenswrapper[4804]: I0217 14:21:10.491489 4804 generic.go:334] "Generic (PLEG): container finished" podID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1" exitCode=0 Feb 17 14:21:10 crc kubenswrapper[4804]: I0217 14:21:10.491544 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"} Feb 17 14:21:11 crc kubenswrapper[4804]: I0217 14:21:11.505261 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerStarted","Data":"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"} Feb 17 14:21:11 crc kubenswrapper[4804]: I0217 14:21:11.534048 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnnnl" podStartSLOduration=3.096888525 podStartE2EDuration="5.534023433s" podCreationTimestamp="2026-02-17 14:21:06 +0000 UTC" firstStartedPulling="2026-02-17 14:21:08.470449108 +0000 UTC m=+3342.581868445" lastFinishedPulling="2026-02-17 14:21:10.907584016 +0000 UTC m=+3345.019003353" observedRunningTime="2026-02-17 14:21:11.522113489 +0000 UTC m=+3345.633532826" watchObservedRunningTime="2026-02-17 14:21:11.534023433 +0000 UTC m=+3345.645442770" Feb 17 14:21:16 crc kubenswrapper[4804]: I0217 14:21:16.600584 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:21:16 crc kubenswrapper[4804]: E0217 14:21:16.602063 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:16 crc kubenswrapper[4804]: I0217 14:21:16.733084 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-nnnnl"
Feb 17 14:21:16 crc kubenswrapper[4804]: I0217 14:21:16.733462 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnnnl"
Feb 17 14:21:17 crc kubenswrapper[4804]: I0217 14:21:17.786874 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nnnnl" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server" probeResult="failure" output=<
Feb 17 14:21:17 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s
Feb 17 14:21:17 crc kubenswrapper[4804]: >
Feb 17 14:21:26 crc kubenswrapper[4804]: I0217 14:21:26.782109 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnnnl"
Feb 17 14:21:26 crc kubenswrapper[4804]: I0217 14:21:26.834256 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnnnl"
Feb 17 14:21:27 crc kubenswrapper[4804]: I0217 14:21:27.022971 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"]
Feb 17 14:21:27 crc kubenswrapper[4804]: I0217 14:21:27.573895 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"
Feb 17 14:21:27 crc kubenswrapper[4804]: E0217 14:21:27.574173 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:21:28 crc kubenswrapper[4804]: I0217 14:21:28.652958 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nnnnl" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server" containerID="cri-o://6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1" gracePeriod=2
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.186136 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.294400 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") "
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.294610 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") "
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.294689 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") "
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.295824 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities" (OuterVolumeSpecName: "utilities") pod "49a4d451-c363-4a82-aa9d-78f76fb0eb2f" (UID: "49a4d451-c363-4a82-aa9d-78f76fb0eb2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.306383 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85" (OuterVolumeSpecName: "kube-api-access-5qb85") pod "49a4d451-c363-4a82-aa9d-78f76fb0eb2f" (UID: "49a4d451-c363-4a82-aa9d-78f76fb0eb2f"). InnerVolumeSpecName "kube-api-access-5qb85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.397273 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") on node \"crc\" DevicePath \"\""
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.397310 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.423950 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49a4d451-c363-4a82-aa9d-78f76fb0eb2f" (UID: "49a4d451-c363-4a82-aa9d-78f76fb0eb2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.499234 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664812 4804 generic.go:334] "Generic (PLEG): container finished" podID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1" exitCode=0
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664874 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"}
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664903 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664930 4804 scope.go:117] "RemoveContainer" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664913 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"afdba798f22337cb141f4e18f48d4777e222a4f386e4ae115dbbb8ab5c633f89"}
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.692349 4804 scope.go:117] "RemoveContainer" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.707191 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"]
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.716590 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"]
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.735597 4804 scope.go:117] "RemoveContainer" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.770458 4804 scope.go:117] "RemoveContainer" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"
Feb 17 14:21:29 crc kubenswrapper[4804]: E0217 14:21:29.770829 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1\": container with ID starting with 6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1 not found: ID does not exist" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.770878 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"} err="failed to get container status \"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1\": rpc error: code = NotFound desc = could not find container \"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1\": container with ID starting with 6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1 not found: ID does not exist"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.770919 4804 scope.go:117] "RemoveContainer" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"
Feb 17 14:21:29 crc kubenswrapper[4804]: E0217 14:21:29.771319 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1\": container with ID starting with b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1 not found: ID does not exist" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.771344 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"} err="failed to get container status \"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1\": rpc error: code = NotFound desc = could not find container \"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1\": container with ID starting with b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1 not found: ID does not exist"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.771360 4804 scope.go:117] "RemoveContainer" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad"
Feb 17 14:21:29 crc kubenswrapper[4804]: E0217 14:21:29.771552 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad\": container with ID starting with 9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad not found: ID does not exist" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad"
Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.771580 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad"} err="failed to get container status \"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad\": rpc error: code = NotFound desc = could not find container \"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad\": container with ID starting with 9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad not found: ID does not exist"
Feb 17 14:21:30 crc kubenswrapper[4804]: I0217 14:21:30.584634 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" path="/var/lib/kubelet/pods/49a4d451-c363-4a82-aa9d-78f76fb0eb2f/volumes"
Feb 17 14:21:40 crc kubenswrapper[4804]: I0217 14:21:40.575532 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"
Feb 17 14:21:40 crc kubenswrapper[4804]: E0217 14:21:40.577020 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:21:54 crc kubenswrapper[4804]: I0217 14:21:54.573903 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"
Feb 17 14:21:54 crc kubenswrapper[4804]: E0217 14:21:54.574635 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:22:08 crc kubenswrapper[4804]: I0217 14:22:08.574183 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"
Feb 17 14:22:08 crc kubenswrapper[4804]: E0217 14:22:08.575092 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:22:22 crc kubenswrapper[4804]: I0217 14:22:22.574131 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"
Feb 17 14:22:22 crc kubenswrapper[4804]: E0217 14:22:22.575094 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:22:34 crc kubenswrapper[4804]: I0217 14:22:34.574541 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"
Feb 17 14:22:35 crc kubenswrapper[4804]: I0217 14:22:35.298183 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a"}
Feb 17 14:22:36 crc kubenswrapper[4804]: I0217 14:22:36.324710 4804 generic.go:334] "Generic (PLEG): container finished" podID="f7b246dc-1d07-4725-b471-88fe82584d24" containerID="d537c8e502573d470d3444dc025ba077411e9d8c16e3d0c7fcbea501f31e4c98" exitCode=0
Feb 17 14:22:36 crc kubenswrapper[4804]: I0217 14:22:36.324991 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerDied","Data":"d537c8e502573d470d3444dc025ba077411e9d8c16e3d0c7fcbea501f31e4c98"}
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.754407 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831061 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831131 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831165 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831183 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831240 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831312 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831347 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831377 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831470 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.832318 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.832908 4804 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.832914 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data" (OuterVolumeSpecName: "config-data") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.836763 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt" (OuterVolumeSpecName: "kube-api-access-548bt") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "kube-api-access-548bt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.839101 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.835509 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.860950 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.861803 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.863311 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.878768 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935095 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935133 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935165 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935174 4804 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935186 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935212 4804 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935224 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935233 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.955127 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.037230 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.343753 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.343756 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerDied","Data":"35721c59346596c631486087761565338b01be7cc9c8b0659285af567a265321"}
Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.344500 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35721c59346596c631486087761565338b01be7cc9c8b0659285af567a265321"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.964544 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"]
Feb 17 14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965511 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-content"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965528 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-content"
Feb 17 14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" containerName="tempest-tests-tempest-tests-runner"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965548 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" containerName="tempest-tests-tempest-tests-runner"
Feb 17 14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965562 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-utilities"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965568 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-utilities"
Feb 17 14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965583 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965589 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965752 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965764 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" containerName="tempest-tests-tempest-tests-runner"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.967212 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.979613 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"]
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.015602 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.015669 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.015692 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.117409 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.117497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.117520 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.118008 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.118312 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.136454 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.289544 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.746232 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"]
Feb 17 14:22:43 crc kubenswrapper[4804]: I0217 14:22:43.425495 4804 generic.go:334] "Generic (PLEG): container finished" podID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" exitCode=0
Feb 17 14:22:43 crc kubenswrapper[4804]: I0217 14:22:43.425563 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348"}
Feb 17 14:22:43 crc kubenswrapper[4804]: I0217 14:22:43.426008 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerStarted","Data":"c32ab401f3ef9117e51efe6faab4692742c641c8ab361c05e3938b5036ba0972"}
Feb 17 14:22:44 crc kubenswrapper[4804]: I0217 14:22:44.441444 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerStarted","Data":"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74"}
Feb 17 14:22:45 crc kubenswrapper[4804]: I0217 14:22:45.453183 4804 generic.go:334] "Generic (PLEG): container finished" podID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" exitCode=0
Feb 17 14:22:45 crc kubenswrapper[4804]: I0217 14:22:45.453256 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74"}
Feb 17 14:22:46 crc kubenswrapper[4804]: I0217 14:22:46.464757 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerStarted","Data":"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe"}
Feb 17 14:22:46 crc kubenswrapper[4804]: I0217 14:22:46.492155 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8chgg" podStartSLOduration=3.07562732 podStartE2EDuration="5.492134866s" podCreationTimestamp="2026-02-17 14:22:41 +0000 UTC" firstStartedPulling="2026-02-17 14:22:43.427821415 +0000 UTC m=+3437.539240752" lastFinishedPulling="2026-02-17 14:22:45.844328961 +0000 UTC m=+3439.955748298" observedRunningTime="2026-02-17 14:22:46.483433743 +0000 UTC m=+3440.594853090" watchObservedRunningTime="2026-02-17 14:22:46.492134866 +0000 UTC m=+3440.603554193"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.347196 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.349047 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.351167 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kssn4"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.383532 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.466574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.466732 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t626\" (UniqueName: \"kubernetes.io/projected/4c6dcbcb-8248-40b5-8fd6-7824c487109e-kube-api-access-5t626\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.567968 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t626\" (UniqueName: \"kubernetes.io/projected/4c6dcbcb-8248-40b5-8fd6-7824c487109e-kube-api-access-5t626\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.568056 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.568455 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.595699 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t626\" (UniqueName: \"kubernetes.io/projected/4c6dcbcb-8248-40b5-8fd6-7824c487109e-kube-api-access-5t626\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.596306 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.691162 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 17 14:22:50 crc kubenswrapper[4804]: I0217 14:22:50.130547 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 17 14:22:50 crc kubenswrapper[4804]: W0217 14:22:50.135396 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c6dcbcb_8248_40b5_8fd6_7824c487109e.slice/crio-18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe WatchSource:0}: Error finding container 18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe: Status 404 returned error can't find the container with id 18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe
Feb 17 14:22:50 crc kubenswrapper[4804]: I0217 14:22:50.688721 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4c6dcbcb-8248-40b5-8fd6-7824c487109e","Type":"ContainerStarted","Data":"18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe"}
Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.290531 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.292235 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.346129 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8chgg"
Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.704387 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
event={"ID":"4c6dcbcb-8248-40b5-8fd6-7824c487109e","Type":"ContainerStarted","Data":"63f7e5eaa00772f47394801064b6c0c3f65c0725404e6632fc6fc9a62c196e00"} Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.719484 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.401090714 podStartE2EDuration="3.719468024s" podCreationTimestamp="2026-02-17 14:22:49 +0000 UTC" firstStartedPulling="2026-02-17 14:22:50.1375849 +0000 UTC m=+3444.249004237" lastFinishedPulling="2026-02-17 14:22:52.45596221 +0000 UTC m=+3446.567381547" observedRunningTime="2026-02-17 14:22:52.71747183 +0000 UTC m=+3446.828891187" watchObservedRunningTime="2026-02-17 14:22:52.719468024 +0000 UTC m=+3446.830887361" Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.757386 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.810167 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:54 crc kubenswrapper[4804]: I0217 14:22:54.722575 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8chgg" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" containerID="cri-o://5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" gracePeriod=2 Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.156468 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.271186 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.271333 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.271422 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.272247 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities" (OuterVolumeSpecName: "utilities") pod "dde1b880-fcbe-493d-85e0-44763ee6e1f8" (UID: "dde1b880-fcbe-493d-85e0-44763ee6e1f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.276385 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q" (OuterVolumeSpecName: "kube-api-access-skv7q") pod "dde1b880-fcbe-493d-85e0-44763ee6e1f8" (UID: "dde1b880-fcbe-493d-85e0-44763ee6e1f8"). InnerVolumeSpecName "kube-api-access-skv7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.295568 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dde1b880-fcbe-493d-85e0-44763ee6e1f8" (UID: "dde1b880-fcbe-493d-85e0-44763ee6e1f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.374086 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.374126 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.374136 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734323 4804 generic.go:334] "Generic (PLEG): container finished" podID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" exitCode=0 Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734400 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe"} Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734747 4804 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"c32ab401f3ef9117e51efe6faab4692742c641c8ab361c05e3938b5036ba0972"} Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734789 4804 scope.go:117] "RemoveContainer" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734419 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.759663 4804 scope.go:117] "RemoveContainer" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.781856 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.790706 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.803149 4804 scope.go:117] "RemoveContainer" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.830711 4804 scope.go:117] "RemoveContainer" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" Feb 17 14:22:55 crc kubenswrapper[4804]: E0217 14:22:55.831092 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe\": container with ID starting with 5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe not found: ID does not exist" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831131 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe"} err="failed to get container status \"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe\": rpc error: code = NotFound desc = could not find container \"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe\": container with ID starting with 5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe not found: ID does not exist" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831159 4804 scope.go:117] "RemoveContainer" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" Feb 17 14:22:55 crc kubenswrapper[4804]: E0217 14:22:55.831480 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74\": container with ID starting with 4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74 not found: ID does not exist" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831559 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74"} err="failed to get container status \"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74\": rpc error: code = NotFound desc = could not find container \"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74\": container with ID starting with 4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74 not found: ID does not exist" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831590 4804 scope.go:117] "RemoveContainer" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" Feb 17 14:22:55 crc kubenswrapper[4804]: E0217 
14:22:55.831835 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348\": container with ID starting with 4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348 not found: ID does not exist" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831856 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348"} err="failed to get container status \"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348\": rpc error: code = NotFound desc = could not find container \"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348\": container with ID starting with 4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348 not found: ID does not exist" Feb 17 14:22:56 crc kubenswrapper[4804]: I0217 14:22:56.586486 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" path="/var/lib/kubelet/pods/dde1b880-fcbe-493d-85e0-44763ee6e1f8/volumes" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.938387 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:23:13 crc kubenswrapper[4804]: E0217 14:23:13.939426 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939444 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" Feb 17 14:23:13 crc kubenswrapper[4804]: E0217 14:23:13.939462 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" 
containerName="extract-utilities" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939471 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="extract-utilities" Feb 17 14:23:13 crc kubenswrapper[4804]: E0217 14:23:13.939513 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="extract-content" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939521 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="extract-content" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939729 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.944245 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.946598 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4k4qm"/"default-dockercfg-p49xk" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.947066 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4k4qm"/"openshift-service-ca.crt" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.951944 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4k4qm"/"kube-root-ca.crt" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.951969 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.060581 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhtq\" (UniqueName: 
\"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.063318 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.165403 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhtq\" (UniqueName: \"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.165481 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.165994 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.183338 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhtq\" (UniqueName: 
\"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.266128 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.756807 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.926429 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerStarted","Data":"47b1ad9526c381d40fe9be04bdae4d60f49c32ce0c24d8723fd0ea8eb1b02180"} Feb 17 14:23:21 crc kubenswrapper[4804]: I0217 14:23:21.020382 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerStarted","Data":"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336"} Feb 17 14:23:22 crc kubenswrapper[4804]: I0217 14:23:22.033967 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerStarted","Data":"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e"} Feb 17 14:23:22 crc kubenswrapper[4804]: I0217 14:23:22.060085 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4k4qm/must-gather-49hd6" podStartSLOduration=3.225402457 podStartE2EDuration="9.060070771s" podCreationTimestamp="2026-02-17 14:23:13 +0000 UTC" firstStartedPulling="2026-02-17 14:23:14.774407057 +0000 UTC m=+3468.885826394" lastFinishedPulling="2026-02-17 14:23:20.609075371 +0000 UTC 
m=+3474.720494708" observedRunningTime="2026-02-17 14:23:22.053248367 +0000 UTC m=+3476.164667714" watchObservedRunningTime="2026-02-17 14:23:22.060070771 +0000 UTC m=+3476.171490108" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.591050 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-pdlxg"] Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.593022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.778224 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.778579 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.880105 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.880242 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " 
pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.880298 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.912100 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:25 crc kubenswrapper[4804]: I0217 14:23:25.209409 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:26 crc kubenswrapper[4804]: I0217 14:23:26.091321 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" event={"ID":"a166cb3c-985b-42bb-943e-5135d68d5827","Type":"ContainerStarted","Data":"9635daeefc377d00a326a2faedd3da9b6967b384943c47b23979e8593277e41f"} Feb 17 14:23:37 crc kubenswrapper[4804]: I0217 14:23:37.196179 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" event={"ID":"a166cb3c-985b-42bb-943e-5135d68d5827","Type":"ContainerStarted","Data":"937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418"} Feb 17 14:23:37 crc kubenswrapper[4804]: I0217 14:23:37.216417 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" podStartSLOduration=2.221199435 podStartE2EDuration="13.216400325s" podCreationTimestamp="2026-02-17 14:23:24 +0000 UTC" firstStartedPulling="2026-02-17 
14:23:25.245342625 +0000 UTC m=+3479.356761962" lastFinishedPulling="2026-02-17 14:23:36.240543515 +0000 UTC m=+3490.351962852" observedRunningTime="2026-02-17 14:23:37.208714634 +0000 UTC m=+3491.320133971" watchObservedRunningTime="2026-02-17 14:23:37.216400325 +0000 UTC m=+3491.327819662" Feb 17 14:24:15 crc kubenswrapper[4804]: I0217 14:24:15.543049 4804 generic.go:334] "Generic (PLEG): container finished" podID="a166cb3c-985b-42bb-943e-5135d68d5827" containerID="937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418" exitCode=0 Feb 17 14:24:15 crc kubenswrapper[4804]: I0217 14:24:15.543124 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" event={"ID":"a166cb3c-985b-42bb-943e-5135d68d5827","Type":"ContainerDied","Data":"937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418"} Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.666184 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.699524 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-pdlxg"] Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.708118 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-pdlxg"] Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.715593 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"a166cb3c-985b-42bb-943e-5135d68d5827\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.715668 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"a166cb3c-985b-42bb-943e-5135d68d5827\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.715757 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host" (OuterVolumeSpecName: "host") pod "a166cb3c-985b-42bb-943e-5135d68d5827" (UID: "a166cb3c-985b-42bb-943e-5135d68d5827"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.716136 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.739413 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz" (OuterVolumeSpecName: "kube-api-access-qp2dz") pod "a166cb3c-985b-42bb-943e-5135d68d5827" (UID: "a166cb3c-985b-42bb-943e-5135d68d5827"). InnerVolumeSpecName "kube-api-access-qp2dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.818247 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.562261 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9635daeefc377d00a326a2faedd3da9b6967b384943c47b23979e8593277e41f" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.562313 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:24:17 crc kubenswrapper[4804]: E0217 14:24:17.646859 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda166cb3c_985b_42bb_943e_5135d68d5827.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda166cb3c_985b_42bb_943e_5135d68d5827.slice/crio-9635daeefc377d00a326a2faedd3da9b6967b384943c47b23979e8593277e41f\": RecentStats: unable to find data in memory cache]" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.889388 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-75j4k"] Feb 17 14:24:17 crc kubenswrapper[4804]: E0217 14:24:17.889764 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" containerName="container-00" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.889777 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" containerName="container-00" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.889973 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" containerName="container-00" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.890612 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.938789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.939338 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.041500 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.041812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.041893 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc 
kubenswrapper[4804]: I0217 14:24:18.071338 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.209298 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.576379 4804 generic.go:334] "Generic (PLEG): container finished" podID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerID="c9976e937dce8ae35118888d70c9c2b90975535717d1cc3679c3c08b380920c6" exitCode=0 Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.591716 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" path="/var/lib/kubelet/pods/a166cb3c-985b-42bb-943e-5135d68d5827/volumes" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.592248 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" event={"ID":"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1","Type":"ContainerDied","Data":"c9976e937dce8ae35118888d70c9c2b90975535717d1cc3679c3c08b380920c6"} Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.592288 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" event={"ID":"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1","Type":"ContainerStarted","Data":"92ee573375d647aa71e22c38c26fe16fd05f0f9bc3b131e96b1189ce81afbe11"} Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.023123 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-75j4k"] Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.031351 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-4k4qm/crc-debug-75j4k"] Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.679639 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782065 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782169 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782389 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host" (OuterVolumeSpecName: "host") pod "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" (UID: "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782943 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.789360 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb" (OuterVolumeSpecName: "kube-api-access-qkjfb") pod "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" (UID: "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1"). 
InnerVolumeSpecName "kube-api-access-qkjfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.884995 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.184632 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-5h27q"] Feb 17 14:24:20 crc kubenswrapper[4804]: E0217 14:24:20.184986 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerName="container-00" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.184998 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerName="container-00" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.185175 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerName="container-00" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.185702 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.292871 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.293505 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.396055 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.396157 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.396361 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc 
kubenswrapper[4804]: I0217 14:24:20.412665 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.502359 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: W0217 14:24:20.530077 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef953a43_c0ed_40e6_9cdc_9fe7596564d5.slice/crio-b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324 WatchSource:0}: Error finding container b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324: Status 404 returned error can't find the container with id b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324 Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.598312 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" path="/var/lib/kubelet/pods/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1/volumes" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.602398 4804 scope.go:117] "RemoveContainer" containerID="c9976e937dce8ae35118888d70c9c2b90975535717d1cc3679c3c08b380920c6" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.602550 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.610479 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" event={"ID":"ef953a43-c0ed-40e6-9cdc-9fe7596564d5","Type":"ContainerStarted","Data":"b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324"} Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.622316 4804 generic.go:334] "Generic (PLEG): container finished" podID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerID="a5e0d360afd08e835a1b932952aecfeb3e5f4e1f58b1f3c9f05af31078c78de7" exitCode=0 Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.622427 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" event={"ID":"ef953a43-c0ed-40e6-9cdc-9fe7596564d5","Type":"ContainerDied","Data":"a5e0d360afd08e835a1b932952aecfeb3e5f4e1f58b1f3c9f05af31078c78de7"} Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.664997 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-5h27q"] Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.673244 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-5h27q"] Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.746589 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.850593 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.850699 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.850732 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host" (OuterVolumeSpecName: "host") pod "ef953a43-c0ed-40e6-9cdc-9fe7596564d5" (UID: "ef953a43-c0ed-40e6-9cdc-9fe7596564d5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.851056 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.863948 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d" (OuterVolumeSpecName: "kube-api-access-j5r9d") pod "ef953a43-c0ed-40e6-9cdc-9fe7596564d5" (UID: "ef953a43-c0ed-40e6-9cdc-9fe7596564d5"). InnerVolumeSpecName "kube-api-access-j5r9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.952961 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:23 crc kubenswrapper[4804]: I0217 14:24:23.642223 4804 scope.go:117] "RemoveContainer" containerID="a5e0d360afd08e835a1b932952aecfeb3e5f4e1f58b1f3c9f05af31078c78de7" Feb 17 14:24:23 crc kubenswrapper[4804]: I0217 14:24:23.642277 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:24 crc kubenswrapper[4804]: I0217 14:24:24.586250 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" path="/var/lib/kubelet/pods/ef953a43-c0ed-40e6-9cdc-9fe7596564d5/volumes" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.744816 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:36 crc kubenswrapper[4804]: E0217 14:24:36.745730 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerName="container-00" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.745743 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerName="container-00" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.745936 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerName="container-00" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.747166 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.761796 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.816835 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.816991 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.817043 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918286 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918413 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918443 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918883 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918894 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.946921 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:37 crc kubenswrapper[4804]: I0217 14:24:37.063841 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:37 crc kubenswrapper[4804]: I0217 14:24:37.621042 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:37 crc kubenswrapper[4804]: I0217 14:24:37.776836 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerStarted","Data":"e476be034a124bd37543c8e10bdc0da87c6541a84b1b15edb997e8e42215c690"} Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.260622 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.415293 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api-log/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.456095 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.517833 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener-log/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.706491 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.728156 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker-log/0.log" Feb 
17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.787338 4804 generic.go:334] "Generic (PLEG): container finished" podID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" exitCode=0 Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.787383 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493"} Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.884495 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p_9ee075c2-2363-4446-8545-dfdece6ca4da/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.988612 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-central-agent/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.033750 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-notification-agent/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.101966 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/proxy-httpd/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.132112 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/sg-core/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.301849 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.318406 4804 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api-log/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.459299 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/cinder-scheduler/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.603148 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-499xq_5c4e88aa-842f-453a-9ce9-8354c16340e9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.616855 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/probe/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.798523 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerStarted","Data":"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df"} Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.832516 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.892060 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq_5ca70007-e938-4bd5-9f2a-66f18b87743a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.071046 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/dnsmasq-dns/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.104261 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.116482 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc_5ecc3e55-21c0-4017-8dce-9c77fd2189ea/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.304947 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-log/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.500010 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-log/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.622600 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-httpd/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.625636 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-httpd/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.732014 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.807604 4804 generic.go:334] "Generic (PLEG): container finished" podID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" exitCode=0 Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.807649 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" 
event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df"} Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.818922 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-65nc8_0a55b597-4920-4fa6-99d5-6deaa6f30a4a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.019114 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon-log/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.090873 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hx4nm_e9b53a85-8a87-4b65-8832-00c4175da541/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.369849 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-9cc757857-wng6k_30df70d3-9323-4ddd-9d1c-2dae72cff6d9/keystone-api/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.404455 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29522281-k9ptv_c2d1f319-5d08-4969-a968-45eba20958a7/keystone-cron/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.543349 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d6aabf20-b0bf-4f35-aec7-098f38bacfd9/kube-state-metrics/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.685772 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc_c0aad2ba-98cf-42b5-9c03-40633fb8ac18/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.817847 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerStarted","Data":"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431"} Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.844235 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fbh6b" podStartSLOduration=3.427210233 podStartE2EDuration="5.844211475s" podCreationTimestamp="2026-02-17 14:24:36 +0000 UTC" firstStartedPulling="2026-02-17 14:24:38.789549457 +0000 UTC m=+3552.900968794" lastFinishedPulling="2026-02-17 14:24:41.206550699 +0000 UTC m=+3555.317970036" observedRunningTime="2026-02-17 14:24:41.841574282 +0000 UTC m=+3555.952993619" watchObservedRunningTime="2026-02-17 14:24:41.844211475 +0000 UTC m=+3555.955630822" Feb 17 14:24:42 crc kubenswrapper[4804]: I0217 14:24:42.051580 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-api/0.log" Feb 17 14:24:42 crc kubenswrapper[4804]: I0217 14:24:42.162781 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-httpd/0.log" Feb 17 14:24:42 crc kubenswrapper[4804]: I0217 14:24:42.331715 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg_84938cd5-694c-423a-a0d1-801f28085377/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.156122 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fc78e86d-494e-417b-8569-b564cdbd069a/nova-cell0-conductor-conductor/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.162045 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-log/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.319039 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-api/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.467069 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a13dbc73-75fc-448b-af44-cb7018d1640e/nova-cell1-conductor-conductor/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.564696 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5c380610-c164-4798-a5df-9b90fd475667/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.954907 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-x8lml_9f17dd92-0402-40c7-bdc7-50b38e37f750/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.021516 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-log/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.382664 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1bac289d-58a7-4e23-8805-c48811d12d32/nova-scheduler-scheduler/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.395982 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.589843 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.644725 4804 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/galera/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.799075 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.985605 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.032794 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/galera/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.048568 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-metadata/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.194563 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_de1a53e3-68ce-4ecd-9c0a-80ffce568891/openstackclient/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.284014 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4s7l5_d286aa08-b0df-44e8-9128-f596f4b44db8/openstack-network-exporter/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.417820 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.654324 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovs-vswitchd/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.709692 4804 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.757378 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.920547 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rzcfd_9c049787-03d2-4679-8705-ec2cd1ad8141/ovn-controller/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.932910 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v478m_be98213b-0510-4f69-9d98-81363c04d8bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.093214 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/openstack-network-exporter/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.167413 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/ovn-northd/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.239019 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/openstack-network-exporter/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.292769 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/ovsdbserver-nb/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.454490 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/openstack-network-exporter/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.467115 4804 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/ovsdbserver-sb/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.696621 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-api/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.798974 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.800060 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-log/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.965211 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.012931 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.045237 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/rabbitmq/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.065261 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.065448 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.120767 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.252401 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.262083 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66_100d84c5-396c-4772-af09-2e223e72a640/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.270682 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/rabbitmq/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.473435 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-z6s9f_c87b0376-c505-452b-90ed-0e6bb7e6e8e0/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.525908 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zctst_ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.690626 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf97c_01fe0e44-6604-4e17-bcb4-05f202508fc7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.729854 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9jrnh_cdb9b3eb-f3d1-4a32-8a87-b0f686cad260/ssh-known-hosts-edpm-deployment/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.921530 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.970085 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-server/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.980522 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.046800 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-httpd/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.194424 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mv8w5_41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2/swift-ring-rebalance/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.228302 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-reaper/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.278027 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-auditor/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.435647 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-replicator/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.448539 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-server/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.509947 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-auditor/0.log" Feb 17 14:24:48 crc 
kubenswrapper[4804]: I0217 14:24:48.543169 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-replicator/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.635452 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-updater/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.656926 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-server/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.779572 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-auditor/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.825176 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-expirer/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.900337 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-replicator/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.981522 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-server/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.028836 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-updater/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.032439 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/rsync/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.146458 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/swift-recon-cron/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.304848 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wtq55_0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.386008 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f7b246dc-1d07-4725-b471-88fe82584d24/tempest-tests-tempest-tests-runner/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.527669 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4c6dcbcb-8248-40b5-8fd6-7824c487109e/test-operator-logs-container/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.666855 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb_ed6642bc-b49f-4e17-a721-b3eae09246aa/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.881574 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fbh6b" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" containerID="cri-o://f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" gracePeriod=2 Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.408388 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.566110 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.566307 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.566365 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.567394 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities" (OuterVolumeSpecName: "utilities") pod "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" (UID: "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.573561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs" (OuterVolumeSpecName: "kube-api-access-gwxbs") pod "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" (UID: "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9"). InnerVolumeSpecName "kube-api-access-gwxbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.619973 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" (UID: "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.668483 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.668529 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.668545 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.695162 4804 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod812cc376-c0d4-45d6-9eb0-3500f3bb0ac1"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod812cc376-c0d4-45d6-9eb0-3500f3bb0ac1] : Timed out while waiting for systemd to remove kubepods-besteffort-pod812cc376_c0d4_45d6_9eb0_3500f3bb0ac1.slice" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893438 4804 generic.go:334] "Generic (PLEG): container finished" podID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" 
exitCode=0 Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893477 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431"} Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893526 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"e476be034a124bd37543c8e10bdc0da87c6541a84b1b15edb997e8e42215c690"} Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893556 4804 scope.go:117] "RemoveContainer" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.935336 4804 scope.go:117] "RemoveContainer" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.966759 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.979448 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.025439 4804 scope.go:117] "RemoveContainer" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.048395 4804 scope.go:117] "RemoveContainer" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" Feb 17 14:24:51 crc kubenswrapper[4804]: E0217 14:24:51.051407 4804 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431\": container with ID starting with f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431 not found: ID does not exist" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.051463 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431"} err="failed to get container status \"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431\": rpc error: code = NotFound desc = could not find container \"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431\": container with ID starting with f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431 not found: ID does not exist" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.051512 4804 scope.go:117] "RemoveContainer" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" Feb 17 14:24:51 crc kubenswrapper[4804]: E0217 14:24:51.052718 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df\": container with ID starting with da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df not found: ID does not exist" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.052757 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df"} err="failed to get container status \"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df\": rpc error: code = NotFound desc = could 
not find container \"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df\": container with ID starting with da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df not found: ID does not exist" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.052785 4804 scope.go:117] "RemoveContainer" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" Feb 17 14:24:51 crc kubenswrapper[4804]: E0217 14:24:51.053604 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493\": container with ID starting with 0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493 not found: ID does not exist" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.053663 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493"} err="failed to get container status \"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493\": rpc error: code = NotFound desc = could not find container \"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493\": container with ID starting with 0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493 not found: ID does not exist" Feb 17 14:24:52 crc kubenswrapper[4804]: I0217 14:24:52.585524 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" path="/var/lib/kubelet/pods/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9/volumes" Feb 17 14:24:55 crc kubenswrapper[4804]: I0217 14:24:55.835260 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:24:55 crc kubenswrapper[4804]: I0217 14:24:55.835587 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:24:58 crc kubenswrapper[4804]: I0217 14:24:58.601283 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f5ef96d0-19a6-4561-bde2-cf38e0280b39/memcached/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.048666 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.251022 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.253804 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.273854 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.448598 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: 
I0217 14:25:14.452722 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/extract/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.487896 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.072774 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-bslfv_fbc5e6cd-47c6-4199-a0f2-e4292a836fac/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.382684 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68c6d499cb-vt6zw_5796dc62-bd84-48b7-9c4c-7d5bf1f7e984/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.577281 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-sxtr2_5727ae12-4720-4470-b5cc-8b8ae81c2af7/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.818618 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-t6hlr_5fa66dc5-a518-40dd-a4b5-dd2b34425ad5/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.996844 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57746b5ff9-wn64m_0b746a42-c0b4-4cb9-9352-3623669bad5a/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.186152 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-cdpkr_07b97973-fa08-4b79-9164-918a4d04f8b7/manager/0.log" Feb 17 
14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.358474 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-lrjgg_bf13099a-fbab-41bf-b30c-5c6b1049af19/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.437083 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-pddsh_430279ab-ba2f-4838-ab07-b851d4df84a0/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.552374 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-96fff9cb8-88sh4_d3332002-6930-418f-8288-e8344be70c6a/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.686143 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66997756f6-vkdg2_2546387a-6a42-4f8d-a321-2f9cbaa11adb/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.953501 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-l5cl2_97925efc-eb46-4a60-b372-b31f13a2c876/manager/0.log" Feb 17 14:25:17 crc kubenswrapper[4804]: I0217 14:25:17.101273 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5ddd85db87-c8hmm_36b1ca46-becb-417e-b05e-777d40246cb6/manager/0.log" Feb 17 14:25:17 crc kubenswrapper[4804]: I0217 14:25:17.417592 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88_ae7598b8-fff5-4044-bbd7-0c8f2f60eed8/manager/0.log" Feb 17 14:25:17 crc kubenswrapper[4804]: I0217 14:25:17.780247 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7cb8c4979f-kfx9x_f69fc148-3a8b-4065-b075-85ecad8339e7/operator/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.205548 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-55nc6_13d9e436-3cb0-4df0-aaf9-e614eba74c89/registry-server/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.570737 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-ltwrc_ac1e20c8-4527-4bba-85bd-2154e1244d3e/manager/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.651973 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-745bbbd77b-ptrs5_79eb8fb0-6207-44c8-b3c2-a00116bcf10b/manager/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.779831 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-9vbg5_42505b9c-f878-4feb-b9a1-9dfa11ec0f56/manager/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.847655 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-rtlpm_44ec973d-9403-48f4-b92c-72f0bd485b0f/operator/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.061224 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-n6fl9_f94e791f-16fd-4364-a246-35bcca0d14e6/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.296518 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-nwmk5_1c7ad838-6225-4001-899a-7f741cb75f2f/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.325394 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-rbrxl_067b67c8-64c5-4c21-b1b1-770aa68e0eb7/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.494836 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c469bc6bb-xlwmb_57038414-fcca-4a2a-8756-46f97cc57d81/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.719489 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5744df64c-mkkrv_8155784a-3945-4ca3-aa9a-b0e089ffac52/manager/0.log" Feb 17 14:25:21 crc kubenswrapper[4804]: I0217 14:25:21.519062 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-4xvfg_545c7d25-7774-4c62-89b8-f491fd4065e8/manager/0.log" Feb 17 14:25:25 crc kubenswrapper[4804]: I0217 14:25:25.835328 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:25:25 crc kubenswrapper[4804]: I0217 14:25:25.835942 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:25:39 crc kubenswrapper[4804]: I0217 14:25:39.814429 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t4m4g_6c98dfab-f166-4eb4-b385-724d6b9b9d7a/control-plane-machine-set-operator/0.log" Feb 17 14:25:40 crc kubenswrapper[4804]: I0217 14:25:40.076831 4804 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/kube-rbac-proxy/0.log" Feb 17 14:25:40 crc kubenswrapper[4804]: I0217 14:25:40.088031 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/machine-api-operator/0.log" Feb 17 14:25:53 crc kubenswrapper[4804]: I0217 14:25:53.041105 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7sfkb_112c357f-f1dc-4a07-bba0-ddf54ab071ff/cert-manager-controller/0.log" Feb 17 14:25:53 crc kubenswrapper[4804]: I0217 14:25:53.244580 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kbdz5_9d2d8008-6348-4f24-8085-d30db8558ab3/cert-manager-cainjector/0.log" Feb 17 14:25:53 crc kubenswrapper[4804]: I0217 14:25:53.311572 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-c8nh8_be70f757-4537-489d-a86e-a1b49fc9af75/cert-manager-webhook/0.log" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.834923 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.835272 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.835322 4804 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.836126 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.836190 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a" gracePeriod=600 Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475172 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a" exitCode=0 Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475232 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a"} Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"} Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475608 4804 scope.go:117] "RemoveContainer" 
containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.978468 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:02 crc kubenswrapper[4804]: E0217 14:26:02.979362 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-content" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979377 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-content" Feb 17 14:26:02 crc kubenswrapper[4804]: E0217 14:26:02.979394 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-utilities" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979403 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-utilities" Feb 17 14:26:02 crc kubenswrapper[4804]: E0217 14:26:02.979428 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979437 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979690 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.981271 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.006066 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.038954 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.039064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.039116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.141086 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.141213 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.141280 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.142158 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.142485 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.171335 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.304435 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.874179 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:04 crc kubenswrapper[4804]: I0217 14:26:04.545967 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" exitCode=0 Feb 17 14:26:04 crc kubenswrapper[4804]: I0217 14:26:04.546020 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7"} Feb 17 14:26:04 crc kubenswrapper[4804]: I0217 14:26:04.546309 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerStarted","Data":"588e0c01fb6efe70994ca955b961363bfa392c6732aced2b41d8b00a42135f3e"} Feb 17 14:26:05 crc kubenswrapper[4804]: I0217 14:26:05.558237 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerStarted","Data":"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c"} Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.569337 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" exitCode=0 Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.569406 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" 
event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c"} Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.569902 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerStarted","Data":"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf"} Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.597741 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4q2q" podStartSLOduration=3.15384256 podStartE2EDuration="4.597717558s" podCreationTimestamp="2026-02-17 14:26:02 +0000 UTC" firstStartedPulling="2026-02-17 14:26:04.549356045 +0000 UTC m=+3638.660775392" lastFinishedPulling="2026-02-17 14:26:05.993231053 +0000 UTC m=+3640.104650390" observedRunningTime="2026-02-17 14:26:06.593470854 +0000 UTC m=+3640.704890201" watchObservedRunningTime="2026-02-17 14:26:06.597717558 +0000 UTC m=+3640.709136895" Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.940370 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-bgf7w_2158c202-5aa4-47aa-87a1-73e4b9043e78/nmstate-console-plugin/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.194237 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/kube-rbac-proxy/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.241152 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jxn7r_81e46a71-360c-4509-ad38-2b2c814a56c2/nmstate-handler/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.270894 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/nmstate-metrics/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.399265 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rkf7s_2789dcb9-5619-4986-a692-1eec733c97ff/nmstate-operator/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.511942 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-dbfqz_36fd4ae3-048e-4e51-b2fa-875a5c84b8e0/nmstate-webhook/0.log" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.305411 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.305931 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.363305 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.671895 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.714960 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:15 crc kubenswrapper[4804]: I0217 14:26:15.641870 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4q2q" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" containerID="cri-o://e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" gracePeriod=2 Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.149412 4804 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.303466 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"6f377127-aca7-4b36-976b-fdc21aadd31b\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.303551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"6f377127-aca7-4b36-976b-fdc21aadd31b\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.303733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"6f377127-aca7-4b36-976b-fdc21aadd31b\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.304889 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities" (OuterVolumeSpecName: "utilities") pod "6f377127-aca7-4b36-976b-fdc21aadd31b" (UID: "6f377127-aca7-4b36-976b-fdc21aadd31b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.309578 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh" (OuterVolumeSpecName: "kube-api-access-5x2mh") pod "6f377127-aca7-4b36-976b-fdc21aadd31b" (UID: "6f377127-aca7-4b36-976b-fdc21aadd31b"). 
InnerVolumeSpecName "kube-api-access-5x2mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.363732 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f377127-aca7-4b36-976b-fdc21aadd31b" (UID: "6f377127-aca7-4b36-976b-fdc21aadd31b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.406243 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.406288 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.406301 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.652170 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" exitCode=0 Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.652239 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.652249 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf"} Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.653372 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"588e0c01fb6efe70994ca955b961363bfa392c6732aced2b41d8b00a42135f3e"} Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.653390 4804 scope.go:117] "RemoveContainer" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.680300 4804 scope.go:117] "RemoveContainer" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.684237 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.695242 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.707909 4804 scope.go:117] "RemoveContainer" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.762603 4804 scope.go:117] "RemoveContainer" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" Feb 17 14:26:16 crc kubenswrapper[4804]: E0217 14:26:16.763101 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf\": container with ID starting with e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf not found: ID does not exist" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763168 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf"} err="failed to get container status \"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf\": rpc error: code = NotFound desc = could not find container \"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf\": container with ID starting with e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf not found: ID does not exist" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763227 4804 scope.go:117] "RemoveContainer" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" Feb 17 14:26:16 crc kubenswrapper[4804]: E0217 14:26:16.763556 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c\": container with ID starting with 5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c not found: ID does not exist" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763596 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c"} err="failed to get container status \"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c\": rpc error: code = NotFound desc = could not find container \"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c\": container with ID 
starting with 5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c not found: ID does not exist" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763623 4804 scope.go:117] "RemoveContainer" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" Feb 17 14:26:16 crc kubenswrapper[4804]: E0217 14:26:16.763913 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7\": container with ID starting with 859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7 not found: ID does not exist" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763948 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7"} err="failed to get container status \"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7\": rpc error: code = NotFound desc = could not find container \"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7\": container with ID starting with 859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7 not found: ID does not exist" Feb 17 14:26:18 crc kubenswrapper[4804]: I0217 14:26:18.585260 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" path="/var/lib/kubelet/pods/6f377127-aca7-4b36-976b-fdc21aadd31b/volumes" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.000546 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/kube-rbac-proxy/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.160469 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/controller/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.281594 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.472663 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.475501 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.476855 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.509131 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.688861 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.692470 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.703122 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.784338 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.913956 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.949318 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.959961 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.008016 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/controller/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.172671 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr-metrics/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.189541 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.214087 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy-frr/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.416802 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/reloader/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.420168 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-gl8tp_0d003d1c-2370-4291-a035-0ebe8b97cfee/frr-k8s-webhook-server/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.652870 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c7c468df9-kbjlb_c17333d4-cfc6-4129-af9e-a8f2db54988b/manager/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.846100 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/kube-rbac-proxy/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.894585 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-996ff79d9-vm8dt_82716046-7f15-43d7-b9de-8fdb68a44c0b/webhook-server/0.log" Feb 17 14:26:35 crc kubenswrapper[4804]: I0217 14:26:35.566489 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/speaker/0.log" Feb 17 14:26:35 crc kubenswrapper[4804]: I0217 14:26:35.634075 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.349856 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.791787 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.823007 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.824353 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.949559 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.997113 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.042803 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/extract/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.123030 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.338246 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.340103 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.381600 4804 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.520456 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.566726 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.793119 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.047776 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/registry-server/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.055799 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.071590 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.085553 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.234425 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.261579 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.508519 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.658468 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.697418 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.716473 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.946231 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.020874 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/extract/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 
14:26:51.022108 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.063003 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/registry-server/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.292921 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.315956 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-26cwx_78a56ea9-6641-4d2d-8471-b40e5f2cf7e5/marketplace-operator/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.472328 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.484069 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.559177 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.737094 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.786745 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.913230 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/registry-server/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.960085 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-utilities/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.148422 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-content/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.149451 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-content/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.198153 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-utilities/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.368749 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-utilities/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.421929 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-content/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.856121 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/registry-server/0.log" Feb 17 
14:28:25 crc kubenswrapper[4804]: I0217 14:28:25.835044 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:28:25 crc kubenswrapper[4804]: I0217 14:28:25.835663 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.362085 4804 generic.go:334] "Generic (PLEG): container finished" podID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" exitCode=0 Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.362218 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerDied","Data":"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336"} Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.363574 4804 scope.go:117] "RemoveContainer" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.917860 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4qm_must-gather-49hd6_dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/gather/0.log" Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.424949 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.425738 4804 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-must-gather-4k4qm/must-gather-49hd6" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" containerID="cri-o://7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" gracePeriod=2 Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.436310 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.830739 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4qm_must-gather-49hd6_dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/copy/0.log" Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.831912 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.040021 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlhtq\" (UniqueName: \"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.042893 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.056703 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq" (OuterVolumeSpecName: "kube-api-access-zlhtq") pod "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" (UID: "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a"). InnerVolumeSpecName "kube-api-access-zlhtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.147511 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlhtq\" (UniqueName: \"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.201054 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" (UID: "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.248804 4804 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.454609 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4qm_must-gather-49hd6_dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/copy/0.log" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.454945 4804 generic.go:334] "Generic (PLEG): container finished" podID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" exitCode=143 Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.455003 4804 scope.go:117] "RemoveContainer" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.455183 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.482717 4804 scope.go:117] "RemoveContainer" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.530316 4804 scope.go:117] "RemoveContainer" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" Feb 17 14:28:50 crc kubenswrapper[4804]: E0217 14:28:50.534782 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e\": container with ID starting with 7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e not found: ID does not exist" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.535030 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e"} err="failed to get container status \"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e\": rpc error: code = NotFound desc = could not find container \"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e\": container with ID starting with 7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e not found: ID does not exist" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.535117 4804 scope.go:117] "RemoveContainer" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:50 crc kubenswrapper[4804]: E0217 14:28:50.535507 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336\": container with ID starting with 
ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336 not found: ID does not exist" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.535540 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336"} err="failed to get container status \"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336\": rpc error: code = NotFound desc = could not find container \"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336\": container with ID starting with ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336 not found: ID does not exist" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.585983 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" path="/var/lib/kubelet/pods/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/volumes" Feb 17 14:28:55 crc kubenswrapper[4804]: I0217 14:28:55.835580 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:28:55 crc kubenswrapper[4804]: I0217 14:28:55.836153 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.836309 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.836885 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.836937 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.837934 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.837980 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" gracePeriod=600 Feb 17 14:29:25 crc kubenswrapper[4804]: E0217 14:29:25.971451 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.795803 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" exitCode=0 Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.795881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"} Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.796188 4804 scope.go:117] "RemoveContainer" containerID="7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a" Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.796792 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:29:26 crc kubenswrapper[4804]: E0217 14:29:26.797068 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:29:36 crc kubenswrapper[4804]: I0217 14:29:36.359860 4804 scope.go:117] "RemoveContainer" containerID="937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418" Feb 17 14:29:40 crc kubenswrapper[4804]: I0217 14:29:40.574908 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:29:40 crc kubenswrapper[4804]: E0217 14:29:40.575801 4804 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:29:54 crc kubenswrapper[4804]: I0217 14:29:54.574607 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:29:54 crc kubenswrapper[4804]: E0217 14:29:54.575361 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.195505 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"] Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198076 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="gather" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198117 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="gather" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198144 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-content" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198153 4804 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-content" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198167 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-utilities" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198175 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-utilities" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198217 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198227 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198241 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198249 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198478 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="gather" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198502 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198514 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.199311 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.202108 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.202932 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.206782 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"] Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.277862 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.277919 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.277964 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.381877 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.382147 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.382193 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.382979 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.396421 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.409452 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.520658 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.010866 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"] Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.242929 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerStarted","Data":"d73341e9e6582eba29e761aaf883d659674c8bff051eb57c84cf289f2ed6dee3"} Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.243300 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerStarted","Data":"465247d0d6d9048459debc68ffe300603683b95e428f418da14de8d8d23c31b3"} Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.260583 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" 
podStartSLOduration=1.260557652 podStartE2EDuration="1.260557652s" podCreationTimestamp="2026-02-17 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:01.257644801 +0000 UTC m=+3875.369064138" watchObservedRunningTime="2026-02-17 14:30:01.260557652 +0000 UTC m=+3875.371976989"
Feb 17 14:30:02 crc kubenswrapper[4804]: I0217 14:30:02.253309 4804 generic.go:334] "Generic (PLEG): container finished" podID="7fb08c68-ef85-4035-b769-a0b54926b503" containerID="d73341e9e6582eba29e761aaf883d659674c8bff051eb57c84cf289f2ed6dee3" exitCode=0
Feb 17 14:30:02 crc kubenswrapper[4804]: I0217 14:30:02.253415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerDied","Data":"d73341e9e6582eba29e761aaf883d659674c8bff051eb57c84cf289f2ed6dee3"}
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.579779 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.650900 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"7fb08c68-ef85-4035-b769-a0b54926b503\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") "
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.651174 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"7fb08c68-ef85-4035-b769-a0b54926b503\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") "
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.651316 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"7fb08c68-ef85-4035-b769-a0b54926b503\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") "
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.651957 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume" (OuterVolumeSpecName: "config-volume") pod "7fb08c68-ef85-4035-b769-a0b54926b503" (UID: "7fb08c68-ef85-4035-b769-a0b54926b503"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.653736 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.657383 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7fb08c68-ef85-4035-b769-a0b54926b503" (UID: "7fb08c68-ef85-4035-b769-a0b54926b503"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.657403 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm" (OuterVolumeSpecName: "kube-api-access-fxblm") pod "7fb08c68-ef85-4035-b769-a0b54926b503" (UID: "7fb08c68-ef85-4035-b769-a0b54926b503"). InnerVolumeSpecName "kube-api-access-fxblm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.756249 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.756281 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.272821 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerDied","Data":"465247d0d6d9048459debc68ffe300603683b95e428f418da14de8d8d23c31b3"}
Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.272855 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="465247d0d6d9048459debc68ffe300603683b95e428f418da14de8d8d23c31b3"
Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.272874 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"
Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.361782 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"]
Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.370298 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"]
Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.586116 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" path="/var/lib/kubelet/pods/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f/volumes"
Feb 17 14:30:07 crc kubenswrapper[4804]: I0217 14:30:07.574907 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:30:07 crc kubenswrapper[4804]: E0217 14:30:07.576661 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:30:18 crc kubenswrapper[4804]: I0217 14:30:18.574583 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:30:18 crc kubenswrapper[4804]: E0217 14:30:18.575515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:30:30 crc kubenswrapper[4804]: I0217 14:30:30.745396 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:30:30 crc kubenswrapper[4804]: E0217 14:30:30.746705 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:30:36 crc kubenswrapper[4804]: I0217 14:30:36.426872 4804 scope.go:117] "RemoveContainer" containerID="96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568"
Feb 17 14:30:45 crc kubenswrapper[4804]: I0217 14:30:45.574316 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:30:45 crc kubenswrapper[4804]: E0217 14:30:45.574966 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:30:59 crc kubenswrapper[4804]: I0217 14:30:59.574152 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:30:59 crc kubenswrapper[4804]: E0217 14:30:59.574908 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.750299 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"]
Feb 17 14:31:06 crc kubenswrapper[4804]: E0217 14:31:06.751268 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb08c68-ef85-4035-b769-a0b54926b503" containerName="collect-profiles"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.751286 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb08c68-ef85-4035-b769-a0b54926b503" containerName="collect-profiles"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.751487 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb08c68-ef85-4035-b769-a0b54926b503" containerName="collect-profiles"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.753124 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.762232 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"]
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.789660 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsqw6\" (UniqueName: \"kubernetes.io/projected/3aa554a7-2c33-433d-89c1-403c44aa0215-kube-api-access-gsqw6\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.789740 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-utilities\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.789786 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-catalog-content\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891167 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsqw6\" (UniqueName: \"kubernetes.io/projected/3aa554a7-2c33-433d-89c1-403c44aa0215-kube-api-access-gsqw6\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891254 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-utilities\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891333 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-catalog-content\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-utilities\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891875 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-catalog-content\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.912933 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsqw6\" (UniqueName: \"kubernetes.io/projected/3aa554a7-2c33-433d-89c1-403c44aa0215-kube-api-access-gsqw6\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:07 crc kubenswrapper[4804]: I0217 14:31:07.076378 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:07 crc kubenswrapper[4804]: I0217 14:31:07.625959 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"]
Feb 17 14:31:08 crc kubenswrapper[4804]: E0217 14:31:08.018150 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aa554a7_2c33_433d_89c1_403c44aa0215.slice/crio-conmon-6d6f8bce427f488369830d0db771a2f0ebc9acafad3d11ad30fd973780d11e19.scope\": RecentStats: unable to find data in memory cache]"
Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.140847 4804 generic.go:334] "Generic (PLEG): container finished" podID="3aa554a7-2c33-433d-89c1-403c44aa0215" containerID="6d6f8bce427f488369830d0db771a2f0ebc9acafad3d11ad30fd973780d11e19" exitCode=0
Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.140901 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerDied","Data":"6d6f8bce427f488369830d0db771a2f0ebc9acafad3d11ad30fd973780d11e19"}
Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.140932 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerStarted","Data":"341fac245a4b0d13d49b64dfe9663a0128241e83acbecdb03149d70c90f7a0a5"}
Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.142781 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 14:31:14 crc kubenswrapper[4804]: I0217 14:31:14.574721 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:31:14 crc kubenswrapper[4804]: E0217 14:31:14.575743 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:31:17 crc kubenswrapper[4804]: I0217 14:31:17.224366 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerStarted","Data":"7c21f8597daa07cd863767e37bd4fca56d166a3f2cc1394deb226b8f50cefd20"}
Feb 17 14:31:19 crc kubenswrapper[4804]: I0217 14:31:19.242846 4804 generic.go:334] "Generic (PLEG): container finished" podID="3aa554a7-2c33-433d-89c1-403c44aa0215" containerID="7c21f8597daa07cd863767e37bd4fca56d166a3f2cc1394deb226b8f50cefd20" exitCode=0
Feb 17 14:31:19 crc kubenswrapper[4804]: I0217 14:31:19.242955 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerDied","Data":"7c21f8597daa07cd863767e37bd4fca56d166a3f2cc1394deb226b8f50cefd20"}
Feb 17 14:31:20 crc kubenswrapper[4804]: I0217 14:31:20.256977 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerStarted","Data":"c8b081f9fe887ef42d98a032a66e58fb0f063c9b148b47aefb5385b2ba5b192e"}
Feb 17 14:31:20 crc kubenswrapper[4804]: I0217 14:31:20.275805 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zn8mk" podStartSLOduration=2.733228632 podStartE2EDuration="14.275785008s" podCreationTimestamp="2026-02-17 14:31:06 +0000 UTC" firstStartedPulling="2026-02-17 14:31:08.142583072 +0000 UTC m=+3942.254002409" lastFinishedPulling="2026-02-17 14:31:19.685139438 +0000 UTC m=+3953.796558785" observedRunningTime="2026-02-17 14:31:20.274781196 +0000 UTC m=+3954.386200543" watchObservedRunningTime="2026-02-17 14:31:20.275785008 +0000 UTC m=+3954.387204345"
Feb 17 14:31:25 crc kubenswrapper[4804]: I0217 14:31:25.574418 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:31:25 crc kubenswrapper[4804]: E0217 14:31:25.575333 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.077827 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.078168 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.501284 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.555328 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zn8mk"
Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.624425 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"]
Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.744723 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"]
Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.744978 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bhcxz" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server" containerID="cri-o://99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca" gracePeriod=2
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.338894 4804 generic.go:334] "Generic (PLEG): container finished" podID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerID="99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca" exitCode=0
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.338966 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca"}
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.339266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208"}
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.339278 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208"
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.364340 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.467959 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"fdf90149-055d-48ca-9336-ca6d6545f8a3\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") "
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.468065 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"fdf90149-055d-48ca-9336-ca6d6545f8a3\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") "
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.468158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"fdf90149-055d-48ca-9336-ca6d6545f8a3\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") "
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.469523 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities" (OuterVolumeSpecName: "utilities") pod "fdf90149-055d-48ca-9336-ca6d6545f8a3" (UID: "fdf90149-055d-48ca-9336-ca6d6545f8a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.477056 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4" (OuterVolumeSpecName: "kube-api-access-l7rn4") pod "fdf90149-055d-48ca-9336-ca6d6545f8a3" (UID: "fdf90149-055d-48ca-9336-ca6d6545f8a3"). InnerVolumeSpecName "kube-api-access-l7rn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.570679 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.570977 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.607183 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdf90149-055d-48ca-9336-ca6d6545f8a3" (UID: "fdf90149-055d-48ca-9336-ca6d6545f8a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.674690 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:29 crc kubenswrapper[4804]: I0217 14:31:29.346355 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 14:31:29 crc kubenswrapper[4804]: I0217 14:31:29.382096 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"]
Feb 17 14:31:29 crc kubenswrapper[4804]: I0217 14:31:29.398720 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"]
Feb 17 14:31:30 crc kubenswrapper[4804]: I0217 14:31:30.585922 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" path="/var/lib/kubelet/pods/fdf90149-055d-48ca-9336-ca6d6545f8a3/volumes"
Feb 17 14:31:36 crc kubenswrapper[4804]: I0217 14:31:36.493324 4804 scope.go:117] "RemoveContainer" containerID="99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca"
Feb 17 14:31:36 crc kubenswrapper[4804]: I0217 14:31:36.537024 4804 scope.go:117] "RemoveContainer" containerID="655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615"
Feb 17 14:31:36 crc kubenswrapper[4804]: I0217 14:31:36.564365 4804 scope.go:117] "RemoveContainer" containerID="aec9aafaeb0231fd50b93156ef23ec8d4f34ac9ec3ae7c91631e24543663c093"
Feb 17 14:31:40 crc kubenswrapper[4804]: I0217 14:31:40.573895 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:31:40 crc kubenswrapper[4804]: E0217 14:31:40.574591 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.660433 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"]
Feb 17 14:31:48 crc kubenswrapper[4804]: E0217 14:31:48.661484 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-utilities"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661500 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-utilities"
Feb 17 14:31:48 crc kubenswrapper[4804]: E0217 14:31:48.661534 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-content"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661543 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-content"
Feb 17 14:31:48 crc kubenswrapper[4804]: E0217 14:31:48.661561 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661569 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661814 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.663044 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.665248 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6mdsd"/"openshift-service-ca.crt"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.665505 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6mdsd"/"default-dockercfg-dhrsb"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.665606 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6mdsd"/"kube-root-ca.crt"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.671474 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"]
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.693580 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.693878 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.795560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.795607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.796204 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.817352 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.979715 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:31:49 crc kubenswrapper[4804]: I0217 14:31:49.448420 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"]
Feb 17 14:31:49 crc kubenswrapper[4804]: W0217 14:31:49.453661 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde9029fd_fb98_4bf0_a6fc_0baf663a4e92.slice/crio-0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341 WatchSource:0}: Error finding container 0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341: Status 404 returned error can't find the container with id 0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341
Feb 17 14:31:49 crc kubenswrapper[4804]: I0217 14:31:49.519562 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerStarted","Data":"0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341"}
Feb 17 14:31:50 crc kubenswrapper[4804]: I0217 14:31:50.532175 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerStarted","Data":"ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1"}
Feb 17 14:31:51 crc kubenswrapper[4804]: I0217 14:31:51.542190 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerStarted","Data":"8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67"}
Feb 17 14:31:51 crc kubenswrapper[4804]: I0217 14:31:51.567522 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6mdsd/must-gather-64wjc" podStartSLOduration=3.567475767 podStartE2EDuration="3.567475767s" podCreationTimestamp="2026-02-17 14:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:51.557266717 +0000 UTC m=+3985.668686054" watchObservedRunningTime="2026-02-17 14:31:51.567475767 +0000 UTC m=+3985.678895104"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.338691 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-m6kvm"]
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.344186 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.413409 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.413706 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.514943 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.515002 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.515144 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.543314 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.573733 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:31:54 crc kubenswrapper[4804]: E0217 14:31:54.574021 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.673868 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm"
Feb 17 14:31:54 crc kubenswrapper[4804]: W0217 14:31:54.728931 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6c19278_b1a0_4a84_b3c5_70b6cc6ad7f6.slice/crio-9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765 WatchSource:0}: Error finding container 9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765: Status 404 returned error can't find the container with id 9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765
Feb 17 14:31:55 crc kubenswrapper[4804]: I0217 14:31:55.594874 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" event={"ID":"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6","Type":"ContainerStarted","Data":"48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779"}
Feb 17 14:31:55 crc kubenswrapper[4804]: I0217 14:31:55.595400 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" event={"ID":"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6","Type":"ContainerStarted","Data":"9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765"}
Feb 17 14:31:55 crc kubenswrapper[4804]: I0217 14:31:55.614048 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" podStartSLOduration=1.614030247 podStartE2EDuration="1.614030247s" podCreationTimestamp="2026-02-17 14:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:55.605882212 +0000 UTC m=+3989.717301549" watchObservedRunningTime="2026-02-17 14:31:55.614030247 +0000 UTC m=+3989.725449584"
Feb 17 14:32:07 crc kubenswrapper[4804]: I0217 14:32:07.574240 4804 scope.go:117] "RemoveContainer"
containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:07 crc kubenswrapper[4804]: E0217 14:32:07.575154 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:19 crc kubenswrapper[4804]: I0217 14:32:19.573789 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:19 crc kubenswrapper[4804]: E0217 14:32:19.574672 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:28 crc kubenswrapper[4804]: I0217 14:32:28.909944 4804 generic.go:334] "Generic (PLEG): container finished" podID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerID="48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779" exitCode=0 Feb 17 14:32:28 crc kubenswrapper[4804]: I0217 14:32:28.910138 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" event={"ID":"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6","Type":"ContainerDied","Data":"48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779"} Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.177914 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.214685 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-m6kvm"] Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.223245 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-m6kvm"] Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.225743 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.225808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.227422 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host" (OuterVolumeSpecName: "host") pod "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" (UID: "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.231812 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n" (OuterVolumeSpecName: "kube-api-access-ncv6n") pod "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" (UID: "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6"). InnerVolumeSpecName "kube-api-access-ncv6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.328030 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.328072 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.584468 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" path="/var/lib/kubelet/pods/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6/volumes" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.942527 4804 scope.go:117] "RemoveContainer" containerID="48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.942676 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.444740 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-2pwmf"] Feb 17 14:32:31 crc kubenswrapper[4804]: E0217 14:32:31.445555 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerName="container-00" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.445574 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerName="container-00" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.445768 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerName="container-00" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.446416 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.555442 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.555574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.658043 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.658183 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.658187 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.688897 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.772314 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.953285 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" event={"ID":"768fb954-46e9-4df8-89f9-b20f65c39f9e","Type":"ContainerStarted","Data":"989bfe1a7fa274157f4057abd8aa1fc74de862c999a0d219cb684917d6013e54"} Feb 17 14:32:32 crc kubenswrapper[4804]: I0217 14:32:32.963251 4804 generic.go:334] "Generic (PLEG): container finished" podID="768fb954-46e9-4df8-89f9-b20f65c39f9e" containerID="c72740c43ea69de8ef43e8dc6df1e3ebad0283045c01f7e18f815758350cdc07" exitCode=0 Feb 17 14:32:32 crc kubenswrapper[4804]: I0217 14:32:32.963370 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" event={"ID":"768fb954-46e9-4df8-89f9-b20f65c39f9e","Type":"ContainerDied","Data":"c72740c43ea69de8ef43e8dc6df1e3ebad0283045c01f7e18f815758350cdc07"} Feb 17 14:32:33 crc kubenswrapper[4804]: I0217 14:32:33.408650 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-2pwmf"] Feb 17 14:32:33 crc kubenswrapper[4804]: I0217 14:32:33.423467 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-2pwmf"] Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.114338 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.205003 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"768fb954-46e9-4df8-89f9-b20f65c39f9e\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.205158 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host" (OuterVolumeSpecName: "host") pod "768fb954-46e9-4df8-89f9-b20f65c39f9e" (UID: "768fb954-46e9-4df8-89f9-b20f65c39f9e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.205406 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"768fb954-46e9-4df8-89f9-b20f65c39f9e\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.206048 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.210786 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t" (OuterVolumeSpecName: "kube-api-access-wvc5t") pod "768fb954-46e9-4df8-89f9-b20f65c39f9e" (UID: "768fb954-46e9-4df8-89f9-b20f65c39f9e"). InnerVolumeSpecName "kube-api-access-wvc5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.308310 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.573898 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:34 crc kubenswrapper[4804]: E0217 14:32:34.574256 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.585924 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" path="/var/lib/kubelet/pods/768fb954-46e9-4df8-89f9-b20f65c39f9e/volumes" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.742290 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-66jj7"] Feb 17 14:32:34 crc kubenswrapper[4804]: E0217 14:32:34.742679 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" containerName="container-00" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.742697 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" containerName="container-00" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.742880 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" 
containerName="container-00" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.743490 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.817808 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.817905 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.919181 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.919377 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.919422 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") 
pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.939529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.986129 4804 scope.go:117] "RemoveContainer" containerID="c72740c43ea69de8ef43e8dc6df1e3ebad0283045c01f7e18f815758350cdc07" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.986176 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.059656 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.995339 4804 generic.go:334] "Generic (PLEG): container finished" podID="01160288-3510-4001-8a02-c356f2b354f1" containerID="e3cf401ef670ad8e345476461ab15c106659c4047295fe10a7113e202c7d1745" exitCode=0 Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.995780 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" event={"ID":"01160288-3510-4001-8a02-c356f2b354f1","Type":"ContainerDied","Data":"e3cf401ef670ad8e345476461ab15c106659c4047295fe10a7113e202c7d1745"} Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.995805 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" event={"ID":"01160288-3510-4001-8a02-c356f2b354f1","Type":"ContainerStarted","Data":"2c574eafadbda7ee7470a18eebfb317e6ae81c8e1704b0d945ce0de7b25c2705"} Feb 17 14:32:36 crc 
kubenswrapper[4804]: I0217 14:32:36.037651 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-66jj7"] Feb 17 14:32:36 crc kubenswrapper[4804]: I0217 14:32:36.047534 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-66jj7"] Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.106340 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.160772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") pod \"01160288-3510-4001-8a02-c356f2b354f1\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.161127 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"01160288-3510-4001-8a02-c356f2b354f1\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.160855 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host" (OuterVolumeSpecName: "host") pod "01160288-3510-4001-8a02-c356f2b354f1" (UID: "01160288-3510-4001-8a02-c356f2b354f1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.161722 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.165905 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5" (OuterVolumeSpecName: "kube-api-access-zb9x5") pod "01160288-3510-4001-8a02-c356f2b354f1" (UID: "01160288-3510-4001-8a02-c356f2b354f1"). InnerVolumeSpecName "kube-api-access-zb9x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.263616 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:38 crc kubenswrapper[4804]: I0217 14:32:38.015572 4804 scope.go:117] "RemoveContainer" containerID="e3cf401ef670ad8e345476461ab15c106659c4047295fe10a7113e202c7d1745" Feb 17 14:32:38 crc kubenswrapper[4804]: I0217 14:32:38.015731 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:38 crc kubenswrapper[4804]: I0217 14:32:38.586645 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01160288-3510-4001-8a02-c356f2b354f1" path="/var/lib/kubelet/pods/01160288-3510-4001-8a02-c356f2b354f1/volumes" Feb 17 14:32:45 crc kubenswrapper[4804]: I0217 14:32:45.574157 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:45 crc kubenswrapper[4804]: E0217 14:32:45.574885 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:59 crc kubenswrapper[4804]: I0217 14:32:59.573991 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:59 crc kubenswrapper[4804]: E0217 14:32:59.574759 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.430984 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.459523 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api-log/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.639587 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.657916 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener-log/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.705677 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.824834 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker-log/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.907014 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p_9ee075c2-2363-4446-8545-dfdece6ca4da/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.038945 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-central-agent/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.110734 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-notification-agent/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.119982 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/proxy-httpd/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.195382 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/sg-core/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.337726 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api-log/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.416593 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.580087 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/cinder-scheduler/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.608820 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/probe/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.083308 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-499xq_5c4e88aa-842f-453a-9ce9-8354c16340e9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.149067 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq_5ca70007-e938-4bd5-9f2a-66f18b87743a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.286341 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.445936 
4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.553954 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc_5ecc3e55-21c0-4017-8dce-9c77fd2189ea/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.575676 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:33:14 crc kubenswrapper[4804]: E0217 14:33:14.575997 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.587514 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/dnsmasq-dns/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.720543 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-log/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.734415 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-httpd/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.887264 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-httpd/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.936562 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-log/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.340277 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.440547 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-65nc8_0a55b597-4920-4fa6-99d5-6deaa6f30a4a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.654026 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon-log/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.668117 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hx4nm_e9b53a85-8a87-4b65-8832-00c4175da541/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.897265 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29522281-k9ptv_c2d1f319-5d08-4969-a968-45eba20958a7/keystone-cron/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.960096 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-9cc757857-wng6k_30df70d3-9323-4ddd-9d1c-2dae72cff6d9/keystone-api/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.081227 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_d6aabf20-b0bf-4f35-aec7-098f38bacfd9/kube-state-metrics/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.173039 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc_c0aad2ba-98cf-42b5-9c03-40633fb8ac18/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.445596 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-api/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.455952 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-httpd/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.567112 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg_84938cd5-694c-423a-a0d1-801f28085377/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.115956 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-log/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.195362 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fc78e86d-494e-417b-8569-b564cdbd069a/nova-cell0-conductor-conductor/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.435573 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a13dbc73-75fc-448b-af44-cb7018d1640e/nova-cell1-conductor-conductor/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.569469 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-api/0.log" 
Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.598924 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5c380610-c164-4798-a5df-9b90fd475667/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.711738 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-x8lml_9f17dd92-0402-40c7-bdc7-50b38e37f750/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.889627 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-log/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.262543 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1bac289d-58a7-4e23-8805-c48811d12d32/nova-scheduler-scheduler/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.299946 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.457149 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.508237 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/galera/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.677692 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.853803 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/galera/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.858928 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.081493 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_de1a53e3-68ce-4ecd-9c0a-80ffce568891/openstackclient/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.135933 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4s7l5_d286aa08-b0df-44e8-9128-f596f4b44db8/openstack-network-exporter/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.307808 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-metadata/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.308242 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.509660 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovs-vswitchd/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.526431 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.612153 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.758819 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-rzcfd_9c049787-03d2-4679-8705-ec2cd1ad8141/ovn-controller/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.837769 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v478m_be98213b-0510-4f69-9d98-81363c04d8bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.963767 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/openstack-network-exporter/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.998754 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/ovn-northd/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.116753 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/openstack-network-exporter/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.199384 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/ovsdbserver-nb/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.328834 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/ovsdbserver-sb/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.366065 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/openstack-network-exporter/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.522635 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-api/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.616413 4804 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-log/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.686122 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.887351 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.892426 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.927637 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/rabbitmq/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.097575 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/rabbitmq/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.176431 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.223207 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66_100d84c5-396c-4772-af09-2e223e72a640/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.963109 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-z6s9f_c87b0376-c505-452b-90ed-0e6bb7e6e8e0/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:22 crc 
kubenswrapper[4804]: I0217 14:33:22.003083 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zctst_ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.156449 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf97c_01fe0e44-6604-4e17-bcb4-05f202508fc7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.246786 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9jrnh_cdb9b3eb-f3d1-4a32-8a87-b0f686cad260/ssh-known-hosts-edpm-deployment/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.477107 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-server/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.507040 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-httpd/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.548368 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mv8w5_41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2/swift-ring-rebalance/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.734874 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-reaper/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.752412 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-auditor/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.820340 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-replicator/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.948017 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-auditor/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.950360 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-server/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.999258 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-replicator/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.048125 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-server/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.156246 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-auditor/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.156847 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-updater/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.210485 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-expirer/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.776137 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-server/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.780609 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-updater/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.784973 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-replicator/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.800300 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/rsync/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.002428 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/swift-recon-cron/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.077451 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wtq55_0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.225509 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f7b246dc-1d07-4725-b471-88fe82584d24/tempest-tests-tempest-tests-runner/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.269376 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4c6dcbcb-8248-40b5-8fd6-7824c487109e/test-operator-logs-container/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.485339 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb_ed6642bc-b49f-4e17-a721-b3eae09246aa/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:29 crc kubenswrapper[4804]: I0217 14:33:29.575193 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 
14:33:29 crc kubenswrapper[4804]: E0217 14:33:29.576676 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.286296 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:35 crc kubenswrapper[4804]: E0217 14:33:35.292443 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01160288-3510-4001-8a02-c356f2b354f1" containerName="container-00" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.292470 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="01160288-3510-4001-8a02-c356f2b354f1" containerName="container-00" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.292729 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="01160288-3510-4001-8a02-c356f2b354f1" containerName="container-00" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.294685 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.301649 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.388744 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.388794 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.388813 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.427888 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f5ef96d0-19a6-4561-bde2-cf38e0280b39/memcached/0.log" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.490979 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod \"redhat-marketplace-5z45j\" (UID: 
\"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491058 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491085 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491663 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491818 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.529091 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " 
pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.651385 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.165166 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.581387 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7" exitCode=0 Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.582625 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7"} Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.582659 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerStarted","Data":"d0388e8ed5eb0bf260d3d3be7512607aa673bcc17891d6b49bd3ddf27df1381b"} Feb 17 14:33:37 crc kubenswrapper[4804]: I0217 14:33:37.594224 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerStarted","Data":"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"} Feb 17 14:33:38 crc kubenswrapper[4804]: I0217 14:33:38.605376 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65" exitCode=0 Feb 17 14:33:38 crc kubenswrapper[4804]: I0217 14:33:38.605492 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"} Feb 17 14:33:40 crc kubenswrapper[4804]: I0217 14:33:40.625464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerStarted","Data":"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"} Feb 17 14:33:40 crc kubenswrapper[4804]: I0217 14:33:40.645797 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5z45j" podStartSLOduration=3.246874107 podStartE2EDuration="5.645777225s" podCreationTimestamp="2026-02-17 14:33:35 +0000 UTC" firstStartedPulling="2026-02-17 14:33:36.582769929 +0000 UTC m=+4090.694189256" lastFinishedPulling="2026-02-17 14:33:38.981673037 +0000 UTC m=+4093.093092374" observedRunningTime="2026-02-17 14:33:40.640340865 +0000 UTC m=+4094.751760202" watchObservedRunningTime="2026-02-17 14:33:40.645777225 +0000 UTC m=+4094.757196562" Feb 17 14:33:43 crc kubenswrapper[4804]: I0217 14:33:43.574436 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:33:43 crc kubenswrapper[4804]: E0217 14:33:43.575532 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.652393 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.652710 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.706726 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.750321 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.057834 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.059314 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5z45j" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server" containerID="cri-o://874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" gracePeriod=2 Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.565402 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.688657 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" exitCode=0 Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.688695 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"} Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.688724 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.689034 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"d0388e8ed5eb0bf260d3d3be7512607aa673bcc17891d6b49bd3ddf27df1381b"} Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.689060 4804 scope.go:117] "RemoveContainer" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.720616 4804 scope.go:117] "RemoveContainer" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.738489 4804 scope.go:117] "RemoveContainer" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.752000 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod 
\"9ee4631f-2436-4b96-bb8c-4137382e12aa\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") "
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.752154 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"9ee4631f-2436-4b96-bb8c-4137382e12aa\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") "
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.752186 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"9ee4631f-2436-4b96-bb8c-4137382e12aa\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") "
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.753029 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities" (OuterVolumeSpecName: "utilities") pod "9ee4631f-2436-4b96-bb8c-4137382e12aa" (UID: "9ee4631f-2436-4b96-bb8c-4137382e12aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.758367 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj" (OuterVolumeSpecName: "kube-api-access-txwcj") pod "9ee4631f-2436-4b96-bb8c-4137382e12aa" (UID: "9ee4631f-2436-4b96-bb8c-4137382e12aa"). InnerVolumeSpecName "kube-api-access-txwcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.779895 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ee4631f-2436-4b96-bb8c-4137382e12aa" (UID: "9ee4631f-2436-4b96-bb8c-4137382e12aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831073 4804 scope.go:117] "RemoveContainer" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"
Feb 17 14:33:48 crc kubenswrapper[4804]: E0217 14:33:48.831360 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba\": container with ID starting with 874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba not found: ID does not exist" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831407 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"} err="failed to get container status \"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba\": rpc error: code = NotFound desc = could not find container \"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba\": container with ID starting with 874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba not found: ID does not exist"
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831427 4804 scope.go:117] "RemoveContainer" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"
Feb 17 14:33:48 crc kubenswrapper[4804]: E0217 14:33:48.831789 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65\": container with ID starting with 082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65 not found: ID does not exist" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831842 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"} err="failed to get container status \"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65\": rpc error: code = NotFound desc = could not find container \"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65\": container with ID starting with 082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65 not found: ID does not exist"
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831876 4804 scope.go:117] "RemoveContainer" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7"
Feb 17 14:33:48 crc kubenswrapper[4804]: E0217 14:33:48.832218 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7\": container with ID starting with 15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7 not found: ID does not exist" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7"
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.832250 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7"} err="failed to get container status \"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7\": rpc error: code = NotFound desc = could not find container \"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7\": container with ID starting with 15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7 not found: ID does not exist"
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.854278 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.854315 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.854332 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:49 crc kubenswrapper[4804]: I0217 14:33:49.018811 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"]
Feb 17 14:33:49 crc kubenswrapper[4804]: I0217 14:33:49.027809 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"]
Feb 17 14:33:50 crc kubenswrapper[4804]: I0217 14:33:50.599337 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" path="/var/lib/kubelet/pods/9ee4631f-2436-4b96-bb8c-4137382e12aa/volumes"
Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.186847 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log"
Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.385395 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log"
Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.396024 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log"
Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.412792 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log"
Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.624485 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log"
Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.624518 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/extract/0.log"
Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.647093 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log"
Feb 17 14:33:55 crc kubenswrapper[4804]: I0217 14:33:55.071478 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-bslfv_fbc5e6cd-47c6-4199-a0f2-e4292a836fac/manager/0.log"
Feb 17 14:33:55 crc kubenswrapper[4804]: I0217 14:33:55.434693 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68c6d499cb-vt6zw_5796dc62-bd84-48b7-9c4c-7d5bf1f7e984/manager/0.log"
Feb 17 14:33:55 crc kubenswrapper[4804]: I0217 14:33:55.659867 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-sxtr2_5727ae12-4720-4470-b5cc-8b8ae81c2af7/manager/0.log"
Feb 17 14:33:55 crc kubenswrapper[4804]: I0217 14:33:55.901291 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-t6hlr_5fa66dc5-a518-40dd-a4b5-dd2b34425ad5/manager/0.log"
Feb 17 14:33:56 crc kubenswrapper[4804]: I0217 14:33:56.344294 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-cdpkr_07b97973-fa08-4b79-9164-918a4d04f8b7/manager/0.log"
Feb 17 14:33:56 crc kubenswrapper[4804]: I0217 14:33:56.528752 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-lrjgg_bf13099a-fbab-41bf-b30c-5c6b1049af19/manager/0.log"
Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.017025 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-pddsh_430279ab-ba2f-4838-ab07-b851d4df84a0/manager/0.log"
Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.192269 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57746b5ff9-wn64m_0b746a42-c0b4-4cb9-9352-3623669bad5a/manager/0.log"
Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.278019 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-96fff9cb8-88sh4_d3332002-6930-418f-8288-e8344be70c6a/manager/0.log"
Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.426114 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66997756f6-vkdg2_2546387a-6a42-4f8d-a321-2f9cbaa11adb/manager/0.log"
Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.491746 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-l5cl2_97925efc-eb46-4a60-b372-b31f13a2c876/manager/0.log"
Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.760190 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5ddd85db87-c8hmm_36b1ca46-becb-417e-b05e-777d40246cb6/manager/0.log"
Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.067296 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88_ae7598b8-fff5-4044-bbd7-0c8f2f60eed8/manager/0.log"
Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.442711 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7cb8c4979f-kfx9x_f69fc148-3a8b-4065-b075-85ecad8339e7/operator/0.log"
Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.578879 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:33:58 crc kubenswrapper[4804]: E0217 14:33:58.579189 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.772046 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-55nc6_13d9e436-3cb0-4df0-aaf9-e614eba74c89/registry-server/0.log"
Feb 17 14:33:59 crc kubenswrapper[4804]: I0217 14:33:59.507289 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-ltwrc_ac1e20c8-4527-4bba-85bd-2154e1244d3e/manager/0.log"
Feb 17 14:33:59 crc kubenswrapper[4804]: I0217 14:33:59.638465 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-9vbg5_42505b9c-f878-4feb-b9a1-9dfa11ec0f56/manager/0.log"
Feb 17 14:33:59 crc kubenswrapper[4804]: I0217 14:33:59.898896 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-rtlpm_44ec973d-9403-48f4-b92c-72f0bd485b0f/operator/0.log"
Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.115035 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-n6fl9_f94e791f-16fd-4364-a246-35bcca0d14e6/manager/0.log"
Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.454973 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-rbrxl_067b67c8-64c5-4c21-b1b1-770aa68e0eb7/manager/0.log"
Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.533438 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-745bbbd77b-ptrs5_79eb8fb0-6207-44c8-b3c2-a00116bcf10b/manager/0.log"
Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.557376 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5744df64c-mkkrv_8155784a-3945-4ca3-aa9a-b0e089ffac52/manager/0.log"
Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.672503 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-nwmk5_1c7ad838-6225-4001-899a-7f741cb75f2f/manager/0.log"
Feb 17 14:34:01 crc kubenswrapper[4804]: I0217 14:34:01.300863 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c469bc6bb-xlwmb_57038414-fcca-4a2a-8756-46f97cc57d81/manager/0.log"
Feb 17 14:34:05 crc kubenswrapper[4804]: I0217 14:34:05.472573 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-4xvfg_545c7d25-7774-4c62-89b8-f491fd4065e8/manager/0.log"
Feb 17 14:34:11 crc kubenswrapper[4804]: I0217 14:34:11.574140 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:34:11 crc kubenswrapper[4804]: E0217 14:34:11.574860 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.528700 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t4m4g_6c98dfab-f166-4eb4-b385-724d6b9b9d7a/control-plane-machine-set-operator/0.log"
Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.574447 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:34:23 crc kubenswrapper[4804]: E0217 14:34:23.574760 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.592904 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/kube-rbac-proxy/0.log"
Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.701309 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/machine-api-operator/0.log"
Feb 17 14:34:35 crc kubenswrapper[4804]: I0217 14:34:35.514262 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7sfkb_112c357f-f1dc-4a07-bba0-ddf54ab071ff/cert-manager-controller/0.log"
Feb 17 14:34:35 crc kubenswrapper[4804]: I0217 14:34:35.766929 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kbdz5_9d2d8008-6348-4f24-8085-d30db8558ab3/cert-manager-cainjector/0.log"
Feb 17 14:34:35 crc kubenswrapper[4804]: I0217 14:34:35.787583 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-c8nh8_be70f757-4537-489d-a86e-a1b49fc9af75/cert-manager-webhook/0.log"
Feb 17 14:34:36 crc kubenswrapper[4804]: I0217 14:34:36.580364 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:34:37 crc kubenswrapper[4804]: I0217 14:34:37.164795 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5"}
Feb 17 14:34:49 crc kubenswrapper[4804]: I0217 14:34:49.795230 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-bgf7w_2158c202-5aa4-47aa-87a1-73e4b9043e78/nmstate-console-plugin/0.log"
Feb 17 14:34:49 crc kubenswrapper[4804]: I0217 14:34:49.998177 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jxn7r_81e46a71-360c-4509-ad38-2b2c814a56c2/nmstate-handler/0.log"
Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.028612 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/nmstate-metrics/0.log"
Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.034566 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/kube-rbac-proxy/0.log"
Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.216363 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rkf7s_2789dcb9-5619-4986-a692-1eec733c97ff/nmstate-operator/0.log"
Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.236817 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-dbfqz_36fd4ae3-048e-4e51-b2fa-875a5c84b8e0/nmstate-webhook/0.log"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.315935 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxm8x"]
Feb 17 14:35:01 crc kubenswrapper[4804]: E0217 14:35:01.317170 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-utilities"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317186 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-utilities"
Feb 17 14:35:01 crc kubenswrapper[4804]: E0217 14:35:01.317217 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-content"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317225 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-content"
Feb 17 14:35:01 crc kubenswrapper[4804]: E0217 14:35:01.317261 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317269 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317526 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.319160 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.327837 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"]
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.411514 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.411622 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.411930 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514258 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514930 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.515009 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.859469 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.937789 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:02 crc kubenswrapper[4804]: I0217 14:35:02.368490 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"]
Feb 17 14:35:02 crc kubenswrapper[4804]: I0217 14:35:02.397628 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerStarted","Data":"ca5f5f0230bc282ba835999b8fdde5d2b83e78fcf1ef6d89ebdf7902bbe288d1"}
Feb 17 14:35:03 crc kubenswrapper[4804]: I0217 14:35:03.406844 4804 generic.go:334] "Generic (PLEG): container finished" podID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6" exitCode=0
Feb 17 14:35:03 crc kubenswrapper[4804]: I0217 14:35:03.407061 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6"}
Feb 17 14:35:05 crc kubenswrapper[4804]: I0217 14:35:05.422209 4804 generic.go:334] "Generic (PLEG): container finished" podID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c" exitCode=0
Feb 17 14:35:05 crc kubenswrapper[4804]: I0217 14:35:05.422341 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c"}
Feb 17 14:35:06 crc kubenswrapper[4804]: I0217 14:35:06.436590 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerStarted","Data":"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"}
Feb 17 14:35:06 crc kubenswrapper[4804]: I0217 14:35:06.469505 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxm8x" podStartSLOduration=3.038177591 podStartE2EDuration="5.469482854s" podCreationTimestamp="2026-02-17 14:35:01 +0000 UTC" firstStartedPulling="2026-02-17 14:35:03.409363334 +0000 UTC m=+4177.520782671" lastFinishedPulling="2026-02-17 14:35:05.840668597 +0000 UTC m=+4179.952087934" observedRunningTime="2026-02-17 14:35:06.463715213 +0000 UTC m=+4180.575134540" watchObservedRunningTime="2026-02-17 14:35:06.469482854 +0000 UTC m=+4180.580902191"
Feb 17 14:35:11 crc kubenswrapper[4804]: I0217 14:35:11.938296 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:11 crc kubenswrapper[4804]: I0217 14:35:11.938682 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:11 crc kubenswrapper[4804]: I0217 14:35:11.991318 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:12 crc kubenswrapper[4804]: I0217 14:35:12.537264 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:13 crc kubenswrapper[4804]: I0217 14:35:13.509325 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"]
Feb 17 14:35:14 crc kubenswrapper[4804]: I0217 14:35:14.504244 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxm8x" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" containerID="cri-o://92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64" gracePeriod=2
Feb 17 14:35:14 crc kubenswrapper[4804]: I0217 14:35:14.968650 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.069521 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") "
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.069620 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") "
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.069663 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") "
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.070612 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities" (OuterVolumeSpecName: "utilities") pod "2a8f57b2-1e50-4720-b9ec-832cc2e41c21" (UID: "2a8f57b2-1e50-4720-b9ec-832cc2e41c21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.076271 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8" (OuterVolumeSpecName: "kube-api-access-j2zd8") pod "2a8f57b2-1e50-4720-b9ec-832cc2e41c21" (UID: "2a8f57b2-1e50-4720-b9ec-832cc2e41c21"). InnerVolumeSpecName "kube-api-access-j2zd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.125349 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a8f57b2-1e50-4720-b9ec-832cc2e41c21" (UID: "2a8f57b2-1e50-4720-b9ec-832cc2e41c21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.171775 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") on node \"crc\" DevicePath \"\""
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.171831 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.171847 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515278 4804 generic.go:334] "Generic (PLEG): container finished" podID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64" exitCode=0
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515328 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"}
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515351 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515364 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"ca5f5f0230bc282ba835999b8fdde5d2b83e78fcf1ef6d89ebdf7902bbe288d1"}
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515386 4804 scope.go:117] "RemoveContainer" containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.532320 4804 scope.go:117] "RemoveContainer" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.553067 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"]
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.556277 4804 scope.go:117] "RemoveContainer" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.561019 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"]
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.598744 4804 scope.go:117] "RemoveContainer" containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"
Feb 17 14:35:15 crc kubenswrapper[4804]: E0217 14:35:15.599248 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64\": container with ID starting with 92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64 not found: ID does not exist" containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599347 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"} err="failed to get container status \"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64\": rpc error: code = NotFound desc = could not find container \"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64\": container with ID starting with 92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64 not found: ID does not exist"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599426 4804 scope.go:117] "RemoveContainer" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c"
Feb 17 14:35:15 crc kubenswrapper[4804]: E0217 14:35:15.599735 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c\": container with ID starting with 8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c not found: ID does not exist" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599829 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c"} err="failed to get container status \"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c\": rpc error: code = NotFound desc = could not find container \"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c\": container with ID starting with 8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c not found: ID does not exist"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599911 4804 scope.go:117] "RemoveContainer" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6"
Feb 17 14:35:15 crc kubenswrapper[4804]: E0217 14:35:15.600192 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6\": container with ID starting with 16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6 not found: ID does not exist" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6"
Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.600315 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6"} err="failed to get container status \"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6\": rpc error: code = NotFound desc = could not find container \"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6\": container with ID starting with 16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6 not found: ID does not exist"
Feb 17 14:35:16 crc kubenswrapper[4804]: I0217 14:35:16.587351 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" path="/var/lib/kubelet/pods/2a8f57b2-1e50-4720-b9ec-832cc2e41c21/volumes"
Feb 17 14:35:18 crc kubenswrapper[4804]: I0217 14:35:18.677902 4804 log.go:25] "Finished parsing log file"
path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/kube-rbac-proxy/0.log" Feb 17 14:35:18 crc kubenswrapper[4804]: I0217 14:35:18.820497 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/controller/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.014740 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.231995 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.241177 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.255450 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.263310 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.472342 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.493923 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.514321 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.560710 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.672964 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.680454 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.715748 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.777227 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/controller/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.891015 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.932078 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.981732 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy-frr/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.130244 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/reloader/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.260261 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-gl8tp_0d003d1c-2370-4291-a035-0ebe8b97cfee/frr-k8s-webhook-server/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.419595 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c7c468df9-kbjlb_c17333d4-cfc6-4129-af9e-a8f2db54988b/manager/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.654694 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-996ff79d9-vm8dt_82716046-7f15-43d7-b9de-8fdb68a44c0b/webhook-server/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.792028 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/kube-rbac-proxy/0.log" Feb 17 14:35:21 crc kubenswrapper[4804]: I0217 14:35:21.370266 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/speaker/0.log" Feb 17 14:35:21 crc kubenswrapper[4804]: I0217 14:35:21.469021 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.009353 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.209495 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" 
Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.216792 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.250667 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.442150 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.450752 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/extract/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.456007 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.647449 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.796987 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.800269 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.817658 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.982063 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.987890 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.203945 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.404352 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.409262 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.508983 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.585881 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/registry-server/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.733530 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.768416 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.963781 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.168893 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.232512 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.244496 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.445895 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/extract/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.481934 4804 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.485032 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.689256 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/registry-server/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.732843 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-26cwx_78a56ea9-6641-4d2d-8471-b40e5f2cf7e5/marketplace-operator/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.883245 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.051321 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.059580 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.073058 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.258799 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.298901 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.427327 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/registry-server/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.444106 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.621755 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.658724 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.660806 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.821927 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.829516 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-utilities/0.log" Feb 
17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.967590 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/registry-server/0.log" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.985573 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:36:44 crc kubenswrapper[4804]: E0217 14:36:44.986991 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-utilities" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987008 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-utilities" Feb 17 14:36:44 crc kubenswrapper[4804]: E0217 14:36:44.987029 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-content" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987037 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-content" Feb 17 14:36:44 crc kubenswrapper[4804]: E0217 14:36:44.987078 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987089 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987395 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.988933 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.000545 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.157000 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.157103 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.157154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.258990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.259464 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.259889 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.259987 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.260695 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.295730 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.325976 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.863149 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:36:46 crc kubenswrapper[4804]: E0217 14:36:46.249526 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24bb8e4f_1bc0_4422_877c_3b9f26a4ded0.slice/crio-conmon-502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.395417 4804 generic.go:334] "Generic (PLEG): container finished" podID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerID="502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199" exitCode=0 Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.395510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199"} Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.395826 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerStarted","Data":"bfe6afe0591c71ea93e7728ff8ca0d1783e36d1a1e8760a729d643c649ba52b5"} Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.398303 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:36:52 crc kubenswrapper[4804]: I0217 14:36:52.454724 4804 generic.go:334] "Generic (PLEG): container finished" podID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerID="0bcf1df09a991fc2dc58734e56b601ca1908332ce3dee158c9bd719c118aac29" exitCode=0 Feb 17 
14:36:52 crc kubenswrapper[4804]: I0217 14:36:52.454809 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"0bcf1df09a991fc2dc58734e56b601ca1908332ce3dee158c9bd719c118aac29"} Feb 17 14:36:53 crc kubenswrapper[4804]: I0217 14:36:53.464930 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerStarted","Data":"c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd"} Feb 17 14:36:53 crc kubenswrapper[4804]: I0217 14:36:53.490830 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8cjvt" podStartSLOduration=3.039248967 podStartE2EDuration="9.490806715s" podCreationTimestamp="2026-02-17 14:36:44 +0000 UTC" firstStartedPulling="2026-02-17 14:36:46.398061542 +0000 UTC m=+4280.509480879" lastFinishedPulling="2026-02-17 14:36:52.84961927 +0000 UTC m=+4286.961038627" observedRunningTime="2026-02-17 14:36:53.482325669 +0000 UTC m=+4287.593745016" watchObservedRunningTime="2026-02-17 14:36:53.490806715 +0000 UTC m=+4287.602226062" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.326939 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.327341 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.402971 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.835023 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.835095 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:37:05 crc kubenswrapper[4804]: I0217 14:37:05.388380 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8cjvt"
Feb 17 14:37:05 crc kubenswrapper[4804]: I0217 14:37:05.456917 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"]
Feb 17 14:37:05 crc kubenswrapper[4804]: I0217 14:37:05.574615 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8cjvt" podUID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerName="registry-server" containerID="cri-o://c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd" gracePeriod=2
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.587099 4804 generic.go:334] "Generic (PLEG): container finished" podID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerID="c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd" exitCode=0
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.598339 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd"}
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.692415 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt"
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.790717 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") "
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.790957 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") "
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.791002 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") "
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.791424 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities" (OuterVolumeSpecName: "utilities") pod "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" (UID: "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.798654 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5" (OuterVolumeSpecName: "kube-api-access-7pbc5") pod "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" (UID: "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0"). InnerVolumeSpecName "kube-api-access-7pbc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.870114 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" (UID: "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.893214 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.893484 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") on node \"crc\" DevicePath \"\""
Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.893557 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.598999 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"bfe6afe0591c71ea93e7728ff8ca0d1783e36d1a1e8760a729d643c649ba52b5"}
Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.599057 4804 scope.go:117] "RemoveContainer" containerID="c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd"
Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.599253 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt"
Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.622446 4804 scope.go:117] "RemoveContainer" containerID="0bcf1df09a991fc2dc58734e56b601ca1908332ce3dee158c9bd719c118aac29"
Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.665439 4804 scope.go:117] "RemoveContainer" containerID="502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199"
Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.674158 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"]
Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.684253 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"]
Feb 17 14:37:08 crc kubenswrapper[4804]: I0217 14:37:08.585021 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" path="/var/lib/kubelet/pods/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0/volumes"
Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.790481 4804 generic.go:334] "Generic (PLEG): container finished" podID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" containerID="ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1" exitCode=0
Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.790573 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerDied","Data":"ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1"}
Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.792174 4804 scope.go:117] "RemoveContainer" containerID="ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1"
Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.834975 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.835066 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:37:26 crc kubenswrapper[4804]: I0217 14:37:26.472809 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/gather/0.log"
Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.758602 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"]
Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.759309 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6mdsd/must-gather-64wjc" podUID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" containerName="copy" containerID="cri-o://8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67" gracePeriod=2
Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.776275 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"]
Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.916709 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/copy/0.log"
Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.917815 4804 generic.go:334] "Generic (PLEG): container finished" podID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" containerID="8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67" exitCode=143
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.176658 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/copy/0.log"
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.177667 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.295099 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") "
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.295260 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") "
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.303531 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r" (OuterVolumeSpecName: "kube-api-access-xth5r") pod "de9029fd-fb98-4bf0-a6fc-0baf663a4e92" (UID: "de9029fd-fb98-4bf0-a6fc-0baf663a4e92"). InnerVolumeSpecName "kube-api-access-xth5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.403049 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") on node \"crc\" DevicePath \"\""
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.447515 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "de9029fd-fb98-4bf0-a6fc-0baf663a4e92" (UID: "de9029fd-fb98-4bf0-a6fc-0baf663a4e92"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.504803 4804 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.586618 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" path="/var/lib/kubelet/pods/de9029fd-fb98-4bf0-a6fc-0baf663a4e92/volumes"
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.933012 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/copy/0.log"
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.934706 4804 scope.go:117] "RemoveContainer" containerID="8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67"
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.934905 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc"
Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.959647 4804 scope.go:117] "RemoveContainer" containerID="ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1"
Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.836155 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.836973 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.837046 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.838457 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.838579 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5" gracePeriod=600
Feb 17 14:37:56 crc kubenswrapper[4804]: I0217 14:37:56.100273 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5" exitCode=0
Feb 17 14:37:56 crc kubenswrapper[4804]: I0217 14:37:56.100386 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5"}
Feb 17 14:37:56 crc kubenswrapper[4804]: I0217 14:37:56.100530 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"
Feb 17 14:37:57 crc kubenswrapper[4804]: I0217 14:37:57.114267 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33"}
Feb 17 14:40:25 crc kubenswrapper[4804]: I0217 14:40:25.835186 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:40:25 crc kubenswrapper[4804]: I0217 14:40:25.835819 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:40:55 crc kubenswrapper[4804]: I0217 14:40:55.835504 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:40:55 crc kubenswrapper[4804]: I0217 14:40:55.837135 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:41:25 crc kubenswrapper[4804]: I0217 14:41:25.835318 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:41:25 crc kubenswrapper[4804]: I0217 14:41:25.836064 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:41:25 crc kubenswrapper[4804]: I0217 14:41:25.836133 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 14:41:25 crc kubenswrapper[4804]: I0217 14:41:25.837450 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:41:25 crc kubenswrapper[4804]: I0217 14:41:25.837695 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33" gracePeriod=600
Feb 17 14:41:25 crc kubenswrapper[4804]: E0217 14:41:25.965540 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:41:26 crc kubenswrapper[4804]: I0217 14:41:26.295798 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33" exitCode=0
Feb 17 14:41:26 crc kubenswrapper[4804]: I0217 14:41:26.295854 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33"}
Feb 17 14:41:26 crc kubenswrapper[4804]: I0217 14:41:26.295936 4804 scope.go:117] "RemoveContainer" containerID="9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5"
Feb 17 14:41:26 crc kubenswrapper[4804]: I0217 14:41:26.298618 4804 scope.go:117] "RemoveContainer" containerID="34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33"
Feb 17 14:41:26 crc kubenswrapper[4804]: E0217 14:41:26.299185 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:41:39 crc kubenswrapper[4804]: I0217 14:41:39.574680 4804 scope.go:117] "RemoveContainer" containerID="34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33"
Feb 17 14:41:39 crc kubenswrapper[4804]: E0217 14:41:39.575925 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 14:41:50 crc kubenswrapper[4804]: I0217 14:41:50.574756 4804 scope.go:117] "RemoveContainer" containerID="34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33"
Feb 17 14:41:50 crc kubenswrapper[4804]: E0217 14:41:50.575600 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"